

Title:
SYSTEM AND METHOD OF AN AI ASSISTED SEARCH, REPORT AND SUMMARY BASED ON TRIGGER CONTENT
Document Type and Number:
WIPO Patent Application WO/2020/106618
Kind Code:
A1
Abstract:
A method of autonomous hyper-relevant search results based on a search trigger can have the steps of receiving a search trigger from a user; analyzing, using a processor running instructions to implement at least one algorithm, the search trigger for a search element; and searching, using a processor running instructions to implement at least one algorithm and without instructions from the user, for a data point related to the search element. The data point can be converted to a data stamp and provided to the user. The method can also include converting a plurality of data points to a plurality of data stamps and compiling the plurality of data stamps to a data stack. The search element can be at least one of a time, location, word, image, name, and event.

Inventors:
LEKA DONALD (US)
Application Number:
PCT/US2019/062006
Publication Date:
May 28, 2020
Filing Date:
November 18, 2019
Assignee:
JUMPTUIT (US)
LEKA DONALD (US)
International Classes:
G06F16/95; G06F16/953; G06F16/9532; G06F16/9538
Foreign References:
US20150112963A12015-04-23
US20100159903A12010-06-24
US20150120766A12015-04-30
US20040030741A12004-02-12
Attorney, Agent or Firm:
DELJUIDICE, Louis J. (US)
Claims:
What is claimed is:

1. A method of autonomous search based on a search trigger, comprising the steps of: receiving a search trigger from a user;

analyzing, using a processor running instructions to implement at least one algorithm, the search trigger for a search element;

searching, using a processor running instructions to implement at least one algorithm and without instructions from the user, for a data point related to the search element;

converting the data point to a data stamp; and

providing the data stamp to the user.

2. The method of autonomous search of claim 1, further comprising the steps of:

converting a plurality of data points to a plurality of data stamps; and

compiling the plurality of data stamps to a data stack.

3. The method of autonomous search of claim 2, further comprising the steps of:

analyzing, using a processor running instructions to implement at least one algorithm and without instructions from the user, the data stack; and

providing a data snap summary of the data stack based on the analysis.

4. The method of autonomous search of claim 1, wherein the search element is at least one of a time, location, word, image, name, and event.

5. The method of autonomous search of claim 1, further comprising the steps of:

dividing the data stamp into at least two data time slices; and

providing the divided data stamp to the user.

6. The method of autonomous search of claim 5, wherein the dividing step comprises: preparing a now slice with an up to date data point, wherein the up to date data point is a data point collected between the time the searching step is performed and a past time based on at least one of the search trigger and the search element; and

preparing a past slice with a past data point collected at the time of the searching step and the past data point comprises data from the past time or earlier.

7. The method of autonomous search of claim 6, wherein the dividing step further comprises:

preparing a future slice, comprising the steps of:

analyzing, using a processor running instructions to implement at least one algorithm and without instructions from the user, at least one of the now slice and the past slice; and

extrapolating from the analysis, using a processor running instructions to implement at least one algorithm and without instructions from the user, a future data point related and distinct from the up to date data point and the past data point.

Description:
SYSTEM AND METHOD OF AN AI ASSISTED SEARCH, REPORT AND SUMMARY BASED ON TRIGGER CONTENT

Cross-Reference to Related Applications

This application claims priority to U.S. Provisional Application No. 62/769,321, filed November 19, 2018, which application is herein incorporated by reference in its entirety.

Field of the Invention

The present invention relates to utilizing artificial intelligence to provide a user with context-significant search results, a full report, and a summary of the report based on trigger content.

Background

Individuals and organizations often make decisions based on incomplete and outdated information, in both their personal and professional spheres. In some contexts, the lack of information comes from the inability to gather all of the relevant information, both public and private, and summarize it so it is easily understood by the user. In other contexts, human memory fades as time passes; even a short time after an event, granular details can be lost. What a user thinks she knows may not reflect the actual facts, or may lack the detail needed to make an informed decision.

This problem is only intensified as human and machine generated metadata exponentially increases and fragments across an expanding universe of cloud services and Internet of Things (IoT) devices. This proliferation of cloud services and IoT devices has accelerated the volume of data generated by consumers and organizations to 23 billion gigabytes per day.

Another problem is timing. While users access and review data and content on a regular basis, that content may not become relevant until days or weeks in the future. Maintaining a link to that knowledge, as well as knowing when to access it, can be daunting, if not impossible.

What is needed is an autonomous AI search that dynamically matches hyper-relevant personal and public data at the point of, or prior to, some trigger, which can be an event, activity or decision that is close at hand. Hyper-relevant search results become the basis for highly targeted autonomous AI searches that generate precision research with comprehensive data and additionally provide summaries, insights and predictions.

Summary

A method of autonomous hyper-relevant search results based on a search trigger can have the steps of receiving a search trigger from a user; analyzing, using a processor running instructions to implement at least one algorithm, the search trigger for a search element; and searching, using a processor running instructions to implement at least one algorithm and without instructions from the user, for a data point related to the search element. The data point can be converted to a data stamp and provided to the user. The method can also include converting a plurality of data points to a plurality of data stamps and compiling the plurality of data stamps to a data stack. The search element can be at least one of a time, location, word, image, name, and event.

The system can also perform the step of analyzing, using a processor running instructions to implement at least one algorithm and without instructions from the user, the data stack. A data snap summary of the data stack can be provided based on the analysis.

The method can further divide the data stamp into at least two data time slices and provide the divided data stamp to the user. In dividing the data stamp (or stack, or snap), a now slice can be prepared with an up to date data point. An up to date data point can be a data point collected between the time the searching step is performed and a past time based on at least one of the search trigger and the search element. Additionally, a past slice can be prepared with a past data point. The past data point can be collected at the time of the searching step, and the past data point can have data from the past time or earlier.

A future slice can be prepared using a method of analyzing, using a processor running instructions to implement at least one algorithm and without instructions from the user, at least one of the now slice and the past slice. The method can then extrapolate from the analysis, using a processor running instructions to implement at least one algorithm and without instructions from the user, a future data point related and distinct from the up to date data point and the past data point.

Brief Description of the Drawings

This invention is described with particularity in the appended claims. The above and further aspects of this invention may be better understood by referring to the following description in conjunction with the accompanying drawings, in which like numerals indicate like structural elements and features in various figures. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention. The drawing figures depict one or more implementations in accord with the present teachings, by way of example only, not by way of limitation.

Figure 1 is a schematic overview of the system of the present invention;

Figure 2 illustrates a data reticulation process and converting metadata to correlated metadata;

Figure 3 illustrates an example of an AI E/L search assistant;

Figure 4 illustrates an example of a typical calendar event;

Figure 5 illustrates an example of an event/location result;

Figure 6 illustrates an example of an interval;

Figure 7 illustrates an example of a data stamp;

Figure 8 illustrates an example of a data stack and data snap;

Figure 9 illustrates an example of a location search trigger;

Figure 10 illustrates an example of data time slices;

Figure 11 illustrates an example of the method of creating data stamps and stacks;

Figure 12 illustrates an example of the method of creating data snaps; and

Figure 13 illustrates an example of the method of creating data slices.

Detailed Description

Turning to Figure 1, an overview of a system 100 to access content and collect metadata is illustrated. A user 10 can have any number of internet-connected devices 12, including a laptop 12a, a smartphone 12b, a tablet 12c, a smart speaker 12d, an internet-connected watch (not illustrated), smart car (not illustrated), smart appliance (not illustrated), smart TV (not illustrated), and all other networked content creation and delivery devices. All or most of a user's 10 devices 12 can also have location tracking hardware or software 13. One example of location tracking hardware 13 uses a GPS ("global positioning system") chip to track the user's location almost anywhere in the world. Other location tracking applications can use existing cellular towers and triangulate the user's position based on signal strength determinations from multiple towers. Other position location techniques can include knowing the location of the wireless access point through which the user 10 accesses the internet 14. Additionally, there are other methods well known in the art.

The user 10 can interact with the devices 12, which in turn are connected to the internet 14 or other networks: public, private, or worldwide. These connections allow the user to access content 16 broadly or utilize any number of services 18, including file storage 20, email servers 22, social media 24, collaboration and content sharing 26, calendar 28, and gaming platforms (not illustrated), as just a few examples of the myriad of on-line services and accounts available to the user 10. The user 10 can then permit the system 100 to access her content 16 and services 18.

The system 100 can have a scanning engine 102, storage 104, analysis engine 106, search engine 108, security exchange 110, and display engine 112. Discussions of these and other aspects of the system 100 are incorporated herein by reference from co-pending application number 15/950,866, filed April 11, 2018 and titled "System and Method of Correlating Multiple Data Points to Create a New Single Data Point".

As is known in the art, all or most individual pieces of data 200 have metadata 202 attached. The metadata 202 describes and provides information about the data 200 without needing to access the data 200 itself. In one example, the scanning engine 102 can simply extract the metadata 202 associated with each piece of data 200 and store the metadata 202 in the memory 104. The scanning engine 102 can also take an individual piece of data 200 and create new metadata 204 based on its own scanning and processing algorithm, which can also be stored in the memory 104. The analysis engine 106 can review the metadata 202, 204 and create additional correlated data points 206 relating to the data 200. The correlated data points 206 can be generated from a combination of metadata 202, 204 and interpreting the information therein. The scanning engine 102, along with scanning the user's devices 12, content 16, and services 18, can also acquire information regarding the user's profile attached to each of the devices 12 and services 18. This allows for more personalized data 208. This is illustrated in Figure 2.
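As a rough sketch of the reticulation flow in Figure 2, the following Python fragment shows one way metadata 202 might be extracted, new metadata 204 created, and a correlated data point 206 derived. All names here (DataItem, scan, correlate) are hypothetical; the patent does not prescribe an implementation.

```python
from dataclasses import dataclass, field

@dataclass
class DataItem:
    content: bytes                                # an individual piece of data 200
    metadata: dict = field(default_factory=dict)  # attached metadata 202

def scan(item: DataItem) -> dict:
    """Create new metadata 204 from the item itself (here, only a size attribute)."""
    return {"size_bytes": len(item.content)}

def correlate(meta_a: dict, meta_b: dict) -> dict:
    """Derive a correlated data point 206 by interpreting two metadata records,
    e.g. flagging that two items were captured at the same location."""
    if meta_a.get("location") and meta_a.get("location") == meta_b.get("location"):
        return {"shared_location": meta_a["location"]}
    return {}

photo = DataItem(b"<jpeg bytes>", {"location": "Bermuda", "taken": "2019-06-01"})
email = DataItem(b"<rfc822 bytes>", {"location": "Bermuda", "sender": "tim@example.com"})
print(scan(photo))                                # new metadata 204
print(correlate(photo.metadata, email.metadata))  # correlated data point 206
```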

Figure 3 illustrates an example of a search assistant based on event and location (AI E/L) 138 accessing the user’s calendar 28. The calendar 28 can be stored on a smartphone 12b or other user device 12, or present in memory 104 or on the web-based storage or application 18. The AI E/L search assistant 138 can keep track of calendar events 28a and react as events come closer in time. In an example, the AI E/L search assistant 138 can see that an internal meeting is scheduled with the user’s 10 team. The AI E/L search assistant 138 can then initiate a search by itself or through the search engine 108 for all data 200 related to that particular meeting and return an event/location result 216. Photos, e-mails, relevant documents, internet search results, social media feeds, etc. can be culled automatically. Another example is a student user 10 heading to class. The AI E/L search assistant 138 can pull all of the relevant materials related to the class the user 10 is about to attend.

The AI E/L search assistant 138 can also key a search on the user's location, using geo-location data 13a received from the user's location tracker 13. While an event 28a may not be in the calendar 28, the AI E/L search assistant 138 can access the user's location data 13a and attempt to determine what event may be at that location, or at a user's final destination which may have an event. To extend the above example, the AI E/L search assistant 138 can start to pull the user's 10 class materials as the user 10 travels to campus, once it determines that the user 10 is in transit.

As an example, Figure 4 illustrates a typical calendar event 28a. The event 28a can include a date 30, time 32, subject 34, location 36, attendees 38 and notes 40. The AI E/L search assistant 138 can use each piece of the event data 30, 32, 34, 36, 38, 40, along with the metadata 202, 204, 206, 208 to determine which search strings are relevant to the event 28a, and then use that to prepare the event/location result 216.
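A minimal sketch of how the event data 30, 32, 34, 36, 38, 40 might be turned into search elements follows; the field names and the search_elements heuristic are illustrative assumptions, not the patent's method.

```python
from dataclasses import dataclass

@dataclass
class CalendarEvent:              # event 28a
    date: str                     # date 30
    time: str                     # time 32
    subject: str                  # subject 34
    location: str                 # location 36
    attendees: list               # attendees 38
    notes: str                    # notes 40

def search_elements(event: CalendarEvent) -> list:
    """Turn each piece of event data into a candidate search string."""
    elements = [event.subject, event.location] + list(event.attendees)
    elements += [word for word in event.notes.split() if len(word) > 3]
    return [e for e in elements if e]

meeting = CalendarEvent("2019-11-18", "10:00", "Q4 planning",
                        "Conference Room B", ["Alice", "Bob"], "budget review")
print(search_elements(meeting))
# ['Q4 planning', 'Conference Room B', 'Alice', 'Bob', 'budget', 'review']
```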

In one example, the event/location result 216 can be presented categorized based on the event data 30, 32, 34, 36, 38, 40 used. Thus, results surrounding the attendees 38 can be separately displayed or linked from the results. Figure 5 illustrates an example of this: the event/location result 216 can list a location result 216-36, in this instance a link to the traffic pattern around the destination. Attendee results 216-38 and note results 216-40 can be listed and segmented by topic or data.

The AI E/L search assistant 138 can understand where a user 10 may be going based on an initial screen of the user's metadata 202, 204, 206, 208. Pictures can provide lists of possible locations, as most digital images include the latitude and longitude embedded in the image. Additionally, a destination entered into a mapping application or an on-line vehicle request application (Uber, Lyft, etc.) can be used as well.

The above initial personal event location search is discussed in detail in co-pending application number 15/950,932, titled "System and Method of AI Assisted Search Based on Events and Location," incorporated herein in its entirety by reference.

Figure 6 illustrates this concept of the time 32 and location 36 of the typical calendar event 28a. The AI E/L search assistant 138 can determine a time interval/distance interval 50 from the event 28a. The time interval/distance interval 50 is illustrated here as a perimeter, but one of skill in the art is aware that the intervals can be stored in any fashion known. As the user approaches the interval 50, either as a time earlier than the event time 32 or as a physical distance from the location 36, the AI E/L search assistant 138 can either start the search, update the search, or begin delivering the event/location result 216. In different examples, the AI E/L search assistant 138 can begin a search the moment the user 10 enters the calendar event 28a. The results can be stored, and then delivered at the appropriate interval 50. Depending on the amount of time and/or data 200 that has changed or passed since the first search, the AI E/L search assistant 138 can refresh the search at the appropriate interval 50. Alternately, the AI E/L search assistant 138 starts the search and delivers the event/location result 216 once the interval 50 is reached.
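One plausible reading of the interval 50 logic is a simple threshold test on time-to-event and distance-to-location, as in the hypothetical sketch below; the margins and the interval_reached function are assumptions for illustration.

```python
from datetime import datetime, timedelta

def interval_reached(event_time: datetime, now: datetime, time_margin: timedelta,
                     user_distance_km: float, distance_margin_km: float) -> bool:
    """True once the user crosses the time/distance interval 50; at that point
    the assistant can start, refresh, or deliver the event/location result 216."""
    time_close = timedelta(0) <= (event_time - now) <= time_margin
    distance_close = user_distance_km <= distance_margin_km
    return time_close or distance_close

event_time = datetime(2019, 11, 18, 10, 0)
print(interval_reached(event_time, datetime(2019, 11, 18, 9, 30),
                       timedelta(hours=1), user_distance_km=5.0,
                       distance_margin_km=2.0))   # True: within one hour of the event
```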

The present invention expands the above concept to use search triggers 300 to perform a search, provide a comprehensive report, and provide a summary of that report. Further, the search results, report and/or summary can be appended or linked to the search trigger 300 and/or content related to the search trigger 300. This allows a user 10 to not only have the relevant information at hand, but to have it saved with the content or the triggering event.

The search trigger 300 can, in an example, be an event (e.g., personal and public scheduled and impromptu events), a location (e.g., user and/or event location), generated from user generated data (e.g., media, gaming, social, collaborative and productivity data), physical activity (e.g., steps, exercise, rest and sleep) and/or significant changes in user trend data (e.g., media, gaming, social, collaborative and productivity activity indexes). In one example, a search trigger 300 is not directly inputted as a search by the user 10. For example, while the calendar event is entered by a user 10, the search surrounding that event as it nears is not. A direct input by a user 10 in this context can be entering a search term in a search engine, or the user navigating directly to a website 17 containing the data.

Calendar and location-based examples of search triggers 300 are given above, searching for personal and public data and returning results. The present examples go further than that. One example allows a trigger search assistant 140 to search for a data stamp 302. A data stamp 302 is one or more data points 304 from a single source that can be hyper-relevant to the search trigger 300. The data stamp 302 can be data points 304 collected from both public and private sources. The data stamp 302 can be appended to the search trigger 300 for ease of use and future retrieval.

An example of a data stamp 302, as illustrated in Figure 7, is a photograph as a search trigger 300. A typical digital photograph can be stamped with the time, date and the coordinates of the device taking the photograph (if it is GPS-enabled). While well known, this is a simple data appendage. While the coordinates are good for identifying where in the world the photograph was taken, they provide no context for the viewer and no additional information for the photographer or viewer. An example of the present invention goes much further by providing data stamps 302 with information relevant to both the environment and the user(s) involved.

The trigger search assistant 140 can analyze the search trigger 300 (in this example the digital photo) and search for relevant information gleaned from the photo (e.g., from a website 17). The data points 304n can be additional information regarding the content from the public record. The data points 304n from the public record data can be contextual to the content, or generically added regardless of content. Public record data can include weather (temperature, wind speed, tides, sun rise/set, etc.), traffic reports, news links, financial market data, etc.

For example, if the photograph is of a sailboat, data points 304n collected from public weather data can include air and water temperature 304a, 304b, tide information 304c, sunrise and sunset 304d, and barometric pressure 304e, and all of these points can now be included in a "weather" data stamp 302a. To further the example, if the sailboat is racing in the America's Cup, other data points 304n can be yacht specifications 304f, crew names and positions 304g, and current wins/losses 304h. These data points 304f-h can now make up the "America's Cup" data stamp 302b.
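A minimal sketch of the data point 304 and data stamp 302 structures described here might look as follows; the class names, field layout and sample sources are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DataPoint:          # data point 304
    name: str
    value: object
    source: str           # e.g. a public website 17

@dataclass
class DataStamp:          # data stamp 302: related points under a common topic
    topic: str
    points: list

weather_stamp = DataStamp("weather", [                 # stamp 302a
    DataPoint("air_temp_c", 24.0, "weather.gov"),      # 304a
    DataPoint("water_temp_c", 22.5, "weather.gov"),    # 304b
    DataPoint("tide", "ebbing", "weather.gov"),        # 304c
])
cup_stamp = DataStamp("americas_cup", [                # stamp 302b
    DataPoint("yacht_class", "AC75", "example-sailing-site"),  # 304f
    DataPoint("record", "3-4", "example-sailing-site"),        # 304h
])
print(weather_stamp.topic, len(weather_stamp.points))
```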

The data stamps 302 can also include personal data points 304n for personalized data stamps 302. Biometric data, calendar entries, links to documents or correspondence, or call logs can all be included. The user's personal data can include a data stamp 302 having a data point 304i of e-mails between her friends regarding the race.

Further, personal data can also mean data points 304 in the public domain relevant to the user. Thus, while the "weather" data stamp 302a can pertain to the weather as depicted in the photograph (search trigger 300) (i.e., the weather at the time the photograph was taken, say in Bermuda), a "user weather" data stamp 302c can be created providing the weather personal to the user's location (e.g., in Tokyo).

Thus, the data stamps 302 can be author centric, recipient centric, and other relevant party centric for all of the data and content above, every time the search trigger 300 is analyzed or delivered to a new user. Data stamps 302 can be created for each individual, or for key individuals, at meetings or events, providing the user 10 with relevant details (all listed herein) about the people and/or places the user 10 is meeting or heading to.

Turning back to the sailboat picture, the author's public data can be appended, but so can the recipient's. So, while the author's stamp can reveal a sunny, breezy day, the recipient could be skiing, and the recipient's weather data stamp 302 may contain freezing temperatures and nor'easter warnings. This allows the recipient to have context as to where she was when the friend who sent her the picture was sailing. The author can choose to share the public author stamp 302d when the content is shared, either by post or direct communication. The public author stamp is stored with the original content/search trigger 300. The recipient can data stamp 302e the content upon receipt, and their copy of the content can include any shared author stamps and their recipient stamp. If the recipient shares the content, the recipient can choose to share the recipient stamps 302e as well.

Data stamps 302 can also include private data points 304. The author's private stamp can record their personal biometrics and other related personal data points 304. The personal data stamps 302 can be encrypted and/or shared on transmittal. Another example is a photograph of an athlete crossing the finish line of a triathlon. He may want to share some personal stamp data regarding his biometrics. The recipient as well may want to stamp their own personal data to the content to give it context, say if they were in the same race and wanted to make a comparison.

Other examples of data stamps 302 can include:

Weather Stamp (API Data);

Environmental Stamp (IoT Sensor Data);

Traffic Stamp (Public Transportation, Road Closures etc.);

Calendar Event Stamp (Personal & Work Calendars);

Public Event Stamp (Events in User’s Location);

Current Event Stamp (Latest News Based on Location);

Time Capsule Stamp (Historical Events Based on Location and Date);

Activity Index Stamp (Media, Gaming, Social, Collaborative and Productivity Data);

Confidence Level Stamp (Identity, Age, Sentiment, Landmark, Object, etc.);

Insight Stamp (Analysis);

Predictive Stamp (Forecast); and

Alert Stamp (Anomalous Health Sensor Data, Extreme Weather, Current Events at Location).

The data stamps 302n can be compiled into a full data stack 306, as illustrated in Figure 8. This can be a comprehensive report of all related data points 304n, or the aggregate of multiple data stamps 302n. The data stack 306 can be broken up and categorized in any number of ways based on the search trigger 300 and/or the user's 10 personal preferences. For example, the data stack 306 can be divided by personal and public data stamps, author and recipient, or type of data point 304 (e.g. e-mail, video, hyperlink, etc.). The data stack 306 is created to allow the user 10 in-depth access to any one data stamp 302 and the data points 304 below it. Alternatively, the report can be studied as a whole to provide a detailed data set around the search trigger 300.
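A compiling step along these lines could be sketched as below, assuming stamps are plain dictionaries keyed by topic; compile_stack and the grouping key are illustrative choices, not the patent's specification.

```python
from collections import defaultdict

def compile_stack(stamps: list) -> dict:
    """Aggregate data stamps 302 into a data stack 306, grouped here by topic;
    the patent also allows grouping by author/recipient, personal/public,
    or data point type."""
    stack = defaultdict(list)
    for stamp in stamps:
        stack[stamp["topic"]].append(stamp)
    return dict(stack)

stamps = [
    {"topic": "weather", "points": {"air_temp_c": 24.0, "tide": "ebbing"}},
    {"topic": "americas_cup", "points": {"wins": 3, "losses": 4}},
]
print(compile_stack(stamps))
```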

In addition to a data stack 306, the trigger search assistant 140 can also provide a data snap 308. The data snap 308 is a summary of the data stack 306. While this can be a routine summary (e.g. number and authors of e-mails and video clips), the trigger search assistant 140 can provide additional AI analysis. Furthering the example above, a simple data snap 308 can just state the weather in Bermuda, the wins/losses, and the number of e-mails from the user's friend in New Zealand. However, the additional AI analysis can provide context and natural language results in the data snap 308. For example, the data snap 308 can read "Despite perfect sailing weather, the American team loses to New Zealand in a close race, and your friend Tim from New Zealand is gloating." Here, the trigger search assistant 140 is analyzing the data points/stamps/stack, putting it all in context, and then providing a succinct condensation of the search trigger 300. Additionally, the trigger search assistant 140 can provide predictions given the data point/stamp/stack 304, 302, 306 at hand. Thus, another piece of the data snap 308 can be "New Zealand expected to win, and you will likely lose your bet," generated by predicting that the next few race days will have similar conditions, noting that the Americans' top sailor was just injured, and noting the wager the user made in her last e-mail to Tim.
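A bare-bones version of the summarization step, without the AI analysis layer the patent describes, might look like the following hypothetical sketch.

```python
def data_snap(stack: dict) -> str:
    """Produce a data snap 308: a short summary of a data stack 306. A real
    assistant would layer AI analysis, natural language and predictions on
    top; this sketch only emits a template sentence per topic."""
    lines = []
    for topic, stamps in stack.items():
        point_count = sum(len(s["points"]) for s in stamps)
        lines.append(f"{topic}: {len(stamps)} stamp(s), {point_count} data point(s)")
    return "; ".join(lines)

stack = {"weather": [{"points": {"air_temp_c": 24.0, "tide": "ebbing"}}],
         "americas_cup": [{"points": {"wins": 3, "losses": 4}}]}
print(data_snap(stack))
# weather: 1 stamp(s), 2 data point(s); americas_cup: 1 stamp(s), 2 data point(s)
```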

Other data points/stamps/stacks 304, 302, 306 can include the search terms (if any) used by the trigger search assistant 140 to find the data points 304. Some data points 304 are found through the user's own data or metadata, subscription services, or other public sources.

The data stamp/stack/snap 302, 306, 308 can be appended automatically to the search trigger 300 by the trigger search assistant 140, or provided separately, even to a distribution list. For the publicly available data, the trigger search assistant 140 can gather data from the relevant repositories; gathering the user's personal data presumes that the trigger search assistant 140 has access to all of a user's data and can draw from that pool of information to create the data stamp/stack/snap 302, 306, 308. This process provides a comprehensive data set that has volume, variety, velocity and veracity, and additionally provides insights and even predictions.

Given the above examples, a distinguishing feature from simply appending the date, time, and global position to the photograph is that, typically, a majority (if not all) of the data points 304 collected to be turned into data stamp(s) 302 are collected from a source separate from the author's device. Continuing the example, while the sailboat photo can be appended with time, date, and coordinates, that information is reported directly by the author's camera/smartphone. The weather and race data stamps 302a, 302b were collected separately, for example from NOAA and sailing/sports websites, and are thus from a third-party repository or device.

Other example distinctions can be that the data stamp 302 can be external to the content itself. Thus, it is not a hyperlink created by the author in a written work. Nor must the data stamp 302 be typical header/footer data appended to all data as it travels a network, including the Internet.

In another set of examples, the search trigger 300 is a location, and the trigger search assistant 140 can monitor sections of road and notify users 10 with warnings of hazards ahead. In a simple case, when the user 10 is driving down a highway, the trigger search assistant 140 can access public information to provide the user 10 with real-time warnings of crashes or other hazards en route.

Figure 9 illustrates linking a location trigger with external sensor triggers to provide stamps/stacks/snaps 302, 306, 308 regarding a location 36. In this example, the location 36 is a forest, and the trigger search assistant 140 can monitor weather data points 304 from a website 17. There are specific conditions that lead to rapidly spreading wildfires, and once some of these conditions are met (e.g., a period of very hot weather that dries out the underbrush of the forest), the trigger search assistant 140 can deliver stamps/stacks/snaps 302, 306, 308 regarding the condition of the location 36. Additionally, the trigger search assistant 140 can add additional data points 304 to provide more relevant stamps 302. Once the weather data points 304 start to lead to wildfire conditions, the trigger search assistant 140 can then add data from, or even deploy, drones 42. The drone 42 can capture the conditions of the underbrush and forest in general. The trigger search assistant 140 can also call up satellite 44 information, and the satellite 44 information and weather forecasts can be analyzed and synchronized with the information from the drone videos. Lightning during a thunderstorm, careless humans or even arson can ignite the underbrush and spread fire very quickly. Thermal imaging can help identify specific critical areas that are prone to fires. The trigger search assistant 140 can provide snaps 308 (and stamps 302 and stacks 306) with predictive information to provide early identification and give firemen a real chance to fight wildfires before they become large-scale. Billions of dollars in damages, and lives, can be saved.
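As a hypothetical illustration of such a condition-based trigger, the sketch below counts hot, dry days before signaling; the thresholds and field names are invented for the example.

```python
def wildfire_conditions_met(daily_weather: list) -> bool:
    """Hypothetical trigger condition: a sustained run of hot, dry days that
    dries out the underbrush. The thresholds are illustrative only."""
    hot_dry_days = sum(1 for day in daily_weather
                       if day["temp_c"] > 32 and day["humidity_pct"] < 20)
    return hot_dry_days >= 5

week = [{"temp_c": 35, "humidity_pct": 15}] * 6
if wildfire_conditions_met(week):
    print("deliver stamps/stacks/snaps; request drone 42 and satellite 44 data")
```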

Another aspect of the trigger search assistant 140 is slices in time, as well as across data content and services 16, 18. The trigger search assistant 140 can autonomously create data time slices of relevant data based on the search trigger 300. A data point, stamp, stack, or snap 304, 302, 306, 308 can have a time dimension and be captured for a single user or across a group of users anywhere in the world. A data time slice 310 can take into account the past, present and future of any one data point 304 or data stamp 302. See Figure 10.

The data slice 310 can constitute the most up-to-date and reliable information culled from data points 304 from every source accessed by the trigger search assistant 140. The data slice 310 can be generated in three different time periods. A now slice 312 can contain up-to-date and reliable data points 304. How far back into the past from the user's 10 present time data can reach and still be considered up-to-date is a function of both the data point 304 being analyzed/retrieved and the user 10. For example, if the user 10 is interested in a sporting event, up-to-date data points 304 could be every few minutes or even real time; any lagging data would be considered unacceptable. However, a slow-paced sport may allow for longer intervals. In contrast, weather data points 304 can lag the user's present time without the data being considered old or stale. For example, a temperature data point taken at noon is likely just as relevant as a temperature point taken at 12:30 pm or even 1:00 pm. Up-to-the-minute temperature data likely has a larger time window as a now slice 312. Again, this is likely not true of weather data tracking, for example, a tornado, where every minute of advance warning can save lives.

The other two data slices 310 are taken from the time periods before and after the now slice 312. A past slice 314 can contain historical data in a defined time range that can be set by the user. As noted with the now slice 312, what the past slice 314 considers the "past" is based on the event and the user 10. The user 10 can also set how far back the past slice 314 reaches: hours, days, months, years, decades, etc. A past slice 314 can be relevant, for example, with stock value information. While heading to a meeting, it may be interesting in the now slice 312 to know the company's current trading value, but a historical past slice 314 can also let the user know if the stock has been increasing or decreasing over time. This information can be telling of a company's overall financial health. Also, this past data can be set to trend, based on the search trigger 300, on a past event, to see how the company performed after the launch of a new product or under the leadership of a particular CEO.

While both the now slice 312 and the past slice 314 can be pulled from data sources (e.g. the internet 14, content 16 and services 18), the trigger search assistant 140 can generate a future slice 316. The future slice 316 can be predictive data that emerges from the culmination of the data points 304 into stamps 302 and stacks 306. The trigger search assistant 140 can sort through the data presented in any one stamp 302 or stack 306 to find a future point. A stamp prediction can be to predict the temperature an hour into the future knowing the current temperature and historical temperatures for the same time. However, a deeper future look can be determined from reviewing the data in a stack 306. The trigger search assistant 140 can review historical weather trends, the effect of those trends on a given commodity (e.g. corn, wheat, coffee), and then make an estimate of how that affects the businesses where that commodity is a key part of their products.
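The slicing and extrapolation described here could be sketched as below, assuming data points are timestamped values; the freshness window and the naive linear extrapolation are illustrative stand-ins for the patent's analysis.

```python
from datetime import datetime, timedelta

def slice_points(points: list, search_time: datetime, freshness: timedelta):
    """Split data points into a now slice 312 (within the freshness window)
    and a past slice 314 (older); the window depends on the trigger, e.g.
    seconds for live sport, roughly an hour for temperature."""
    now_slice = [p for p in points if search_time - p["t"] <= freshness]
    past_slice = [p for p in points if search_time - p["t"] > freshness]
    return now_slice, past_slice

def future_slice(past: list, now: list):
    """Extrapolate a future data point for the future slice 316 from the past
    and now slices; a naive linear trend stands in for deeper analysis."""
    pts = sorted(past + now, key=lambda p: p["t"])
    if len(pts) < 2:
        return None
    step = pts[-1]["t"] - pts[-2]["t"]
    delta = pts[-1]["value"] - pts[-2]["value"]
    return {"t": pts[-1]["t"] + step, "value": pts[-1]["value"] + delta}

t0 = datetime(2019, 11, 18, 12, 0)
points = [{"t": t0 - timedelta(hours=2), "value": 18.0},
          {"t": t0 - timedelta(hours=1), "value": 20.0},
          {"t": t0, "value": 22.0}]
now_s, past_s = slice_points(points, t0, timedelta(minutes=90))
print(future_slice(past_s, now_s))  # predicts ~24.0 one hour ahead
```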

Figure 11 illustrates an example of a method of autonomous search based on a search trigger. The method can include the step of receiving a search trigger 300 from a user 10 (step 1100). As the system 100 and its engines 102, 106 scan through a user's data 200 and capture and create metadata 202, 204, 206, 208, the assistants 138, 140 are primed to assist the user 10. When a search trigger 300 is received, the trigger search assistant 140 analyzes the search trigger 300 for a search element (step 1102). The search element can be extracted from the trigger 300 or its metadata 202, 204, 206, 208. As the examples above note, the search element can at first be a sailboat image; the search assistant 140 then determines it is a boat in the America's Cup. Another example above is the calendar event 28a, whose event data 30, 32, 34, 36, 38, 40, along with the metadata 202, 204, 206, 208, is used to determine which search elements are relevant. People, places, times, companies, etc. can all be relevant search elements.

Once the search element(s) are determined, the assistant 140 can search, without instructions from the user, for a data point related to the search element (step 1104). Again from the examples above, if the search element is a place, there can be weather data points (temperature, wind speed, precipitation, tides, ocean temperature, etc.), news data points, traffic data points, etc. All of the points are extracted from searches based on the search element. Once the one or more points are collected, they are converted to a data stamp (step 1106). As noted, a data stamp 302 is based around a common topic. A weather stamp can have multiple data points regarding the weather: temperature, wind speed, precipitation, etc. These are all points for the same stamp. All of the weather data may not necessarily be retrieved from the same website/data source. The weather stamp based on the trigger location can be from NOAA (National Oceanic and Atmospheric Administration), and a weather stamp from the user's location, say in Tokyo, can come from the JMA (Japan Meteorological Agency). Once created, the data stamp 302 is provided to the user 10 (step 1108). The data stamp 302 can be presented to the user 10 in any format appropriate for the information to be conveyed, the user's preference, and any device constraints.
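Tying steps 1100-1108 together, a hypothetical end-to-end sketch might read as follows; fetch_public_data and the trigger format are assumptions for illustration.

```python
def fetch_public_data(element: str) -> dict:
    """Hypothetical stand-in for the assistant's queries against public
    repositories (e.g. a weather service for a location element)."""
    return {"air_temp_c": 24.0, "wind_kts": 12}

def autonomous_search(trigger: dict) -> dict:
    """End-to-end sketch of steps 1100-1108: receive the trigger, extract a
    search element, search without user instruction, convert the results to
    a data stamp, and return it to be provided to the user."""
    element = trigger.get("location")                 # step 1102: analyze
    points = fetch_public_data(element)               # step 1104: search
    stamp = {"topic": "weather", "element": element,  # step 1106: convert
             "points": points}
    return stamp                                      # step 1108: provide

print(autonomous_search({"type": "photo", "location": "Bermuda"}))
```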

If the search trigger 300 contains multiple search elements, based on at least one of its content, the search assistant's 140 analysis, or the user's 10 preferences, multiple data points 304n can be converted into a plurality of data stamps 302n (step 1110). Thus, while the analysis of the search trigger can determine that weather and America's Cup stamps are appropriate, the weather stamp may be two stamps, one for the race location and the other for the user's location. The same holds true for the data time slices 310: each slice of each stamp 302 can be consolidated in one stamp or multiple stamps. However, data stamps 302 having different search elements typically are not consolidated into one stamp (e.g. weather and standings) but are kept separate and can then be compiled into a data stack 306 (step 1112). Figure 12 further illustrates the search assistant 140 analyzing, without instructions from the user 10, the data stack 306 (step 1114) and then providing a data snap 308 summary of the data stack 306 based on the analysis (step 1116).

Figure 13 illustrates dividing any of the stamps, stacks, or snaps 302, 306, 308 into data time slices 310. The figure illustrates the method for the data stamp 302, but the method can be performed on stacks 306 and snaps 308 as well. The method includes dividing the data stamp 302 into at least two data time slices 310 (step 1300) and then providing the divided data stamp to the user 10 (step 1302). An example of dividing or time slicing a data stamp 302 can include preparing a now slice 312 with an up to date data point (step 1304). An up to date data point can be a data point collected between the time the searching step is performed and a past time based on at least one of the search trigger and the search element. A past slice 314 can also be prepared (step 1306) with a past data point collected at the time of the searching step, where the past data point comprises data from the past time or earlier. The past time can be any time earlier than the up to date data point time window, extending back as needed by the search trigger or user setting.

Figure 13 also illustrates preparing a future slice 316 (step 1308). That method can include analyzing, without instructions from the user, at least one of the now slice 312 and the past slice 314 (step 1310). A future data point can be extrapolated from the analysis (step 1312), typically without instructions from the user 10. A future data point is related to, but distinct from, the up to date data point and the past data point. The future data point can be extrapolated using any known techniques. Trend data and market forecasting are examples, but are shallow comparisons to the analysis that is performed to create the future slice 316.

As discussed throughout, examples of search elements can be at least one of a time, location, word, image, name, and event. Tweets, photos, posts, e-mails, videos, calendar events, news announcements, health data, environmental data, milestone events, etc. all can be the triggers containing these elements that are extracted and acted upon.

The storage/memory 104 is non-transient and can be of the type known to those of skill in the art, e.g., magnetic, solid state or optical. The storage 104 can be centralized in a server or decentralized in a cloud storage configuration. Both the AI E/L search assistant 138 and the trigger search assistant 140 can be hardware (processor), software, programming, or a series of distributed computing elements or routines. Both search assistants 138, 140 may incorporate multiple processes and use other hardware or software elements to accomplish the above described tasks. These computing elements can be disparately located and have different primary functions. However, the present invention harnesses these disparate elements to perform the analysis and search functions.

When the phrase "without instructions from the user" is used, it typically means at the time the particular step is taken. Those of skill realize that a user can set initial parameters regarding any of the steps and results. However, once the parameters are set, the system 100 performs certain steps without the user's explicit intervention or selection. The power of the present invention is the ability to constantly monitor a user's data stream/lake, both personal and professional, to sort through the data and to arrive at relevant results at the relevant time. Prior art search engines can be deployed across narrow bands of data (Google for the internet, search terms for a document, etc.), but users then need to know where particular data might be and which search engine to deploy. The difference with the present invention is finding the data and synthesizing it according to the user's needs, without any foreknowledge by the user as to where it is stored or present memory of its existence.

It must also be noted that, as used in the specification and the appended claims, the singular forms "a," "an" and "the" include plural referents unless the context clearly dictates otherwise. By "comprising" or "containing" or "including" is meant that at least the named component or method step is present in the article or method but does not exclude the presence of other components or method steps, even if the other such components or method steps have the same function as what is named.

It is also understood that the mention of one or more method steps does not preclude the presence of additional method steps or intervening method steps between those steps expressly identified. Similarly, it is also to be understood that the mention of one or more components in a device or system does not preclude the presence of additional components or intervening components between those components expressly identified.

The design and functionality described in this application is intended to be exemplary in nature and is not intended to limit the instant disclosure in any way. Those having ordinary skill in the art will appreciate that the teachings of the disclosure may be implemented in a variety of suitable forms, including those forms disclosed herein and additional forms known to those having ordinary skill in the art.

Certain examples of this technology are described above with reference to flow diagrams. Some blocks of the block diagrams and flow diagrams may not necessarily need to be performed in the order presented or may not necessarily need to be performed at all, according to some examples of the disclosure.

While certain examples of this disclosure have been described in connection with what is presently considered to be the most practical and various examples, it is to be understood that this disclosure is not to be limited to the disclosed examples, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

This written description uses examples to disclose certain examples of the technology and also to enable any person skilled in the art to practice certain examples of this technology, including making and using any apparatuses or systems and performing any incorporated methods. The patentable scope of certain examples of the technology is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.




 