By Andrew Dugan
Senior AI Technical Content Creator II

Like many of us, I’ve felt overwhelmed by the constant notifications and information overload of daily life: social media, the news, emails, personal messages, and more. Can AI be the tool that helps us unplug from our busy, tech-focused lives? It may seem paradoxical, but yes, it can. AI can act as a filter for the firehose of information we are sprayed with every day, condensing it down to only what we need to live productive lives.
I demonstrate this with a new tool, built on Sonnet 4.6, called the Daily Digest. It compiles my notifications, news, to-do list, schedule, email, weather, traffic, and more into a single update tailored specifically to me. In this tutorial, I will cover how it works and how Claude Sonnet 4.6 made it possible.
I’ve been wanting to build this tool for a while now, and Sonnet 4.6’s recent release made this the right moment. Sonnet 4.6 is a significant improvement over Sonnet 4.5; some even report it’s comparable to Opus 4.5. It reasons better over long contexts and adapts its reasoning effort to the complexity of the task, ideally lowering token usage. Identifying relevant information and making tailored inferences over a potentially unlimited data pool of news articles, emails, to-do items, and eventually social media posts demands exactly that level of reasoning, so Sonnet 4.6 is a good place to start.
Read about Sonnet 4.6’s additional features, including improved computer use and coding capabilities.
For third-party services, the Daily Digest currently uses the Google Gmail and Calendar APIs for checking email and calendar events, the Google Routes API for checking traffic on daily drives, the NewsAPI.org API for news, the openweathermap.org API for updated weather reports, and the Todoist API for my to-do list. More APIs can be added to the workflow seamlessly, but these cover the basics without risking oversharing personal data with questionable services.
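Whatever mix of services you choose, the first step is collecting every response into one structure the model can read. A minimal sketch of that aggregation step is below; the `fetch` callables are hypothetical placeholders for the real Gmail, Routes, NewsAPI, OpenWeatherMap, and Todoist clients.

```python
from typing import Any, Callable, Dict

def build_context(fetchers: Dict[str, Callable[[], Any]]) -> Dict[str, Any]:
    """Call each source's fetcher and collect results, tolerating failures."""
    context = {}
    for name, fetch in fetchers.items():
        try:
            context[name] = fetch()
        except Exception as exc:  # one broken API should not sink the whole digest
            context[name] = {"error": str(exc)}
    return context

def broken_weather():
    # Stands in for a real weather API call that times out.
    raise TimeoutError("weather API timeout")

# Stub fetchers stand in for the real API clients.
context = build_context({
    "email": lambda: [{"from": "boss@example.com", "subject": "Q3 report"}],
    "todos": lambda: ["Renew passport", "Book dentist"],
    "weather": broken_weather,
})
```

Recording failures instead of raising means a single flaky API leaves you with a slightly thinner digest rather than no digest at all.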
A lot can be done with a single Sonnet 4.6 prompt that takes all of the API response data and creates a daily summary. It can organize the priorities, filter out relevant news, and infer whether information from one API response will have an effect on information from another response. The limitations of a single prompt mostly arise when you want to make an API call with specific information that needs to be parsed out of your data. For example, to know the weather and traffic, you need to know which locations are relevant for making the weather API calls and which origin/destination locations are needed for the traffic API calls.
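A sketch of the single-prompt approach is below, using the official `anthropic` Python SDK. The prompt wording and the model ID string are assumptions; adjust both to your needs and to the model list available on your account.

```python
import json

def build_digest_prompt(context: dict) -> str:
    """Fold every API response into one instruction-rich prompt."""
    return (
        "You are my personal morning-briefing assistant. Using ONLY the JSON "
        "data below, write a Daily Digest: emails that need attention, "
        "today's calendar events and how to prepare for them, two or three "
        "relevant news items, and a prioritized focus list. Call out cases "
        "where one source affects another (e.g. rain on a commute day).\n\n"
        + json.dumps(context, indent=2)
    )

def generate_digest(context: dict) -> str:
    import anthropic  # imported lazily so the prompt helper works without the SDK
    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    response = client.messages.create(
        model="claude-sonnet-4-6",  # assumed model ID; check your account's model list
        max_tokens=2000,
        messages=[{"role": "user", "content": build_digest_prompt(context)}],
    )
    return response.content[0].text
```

Keeping the prompt construction in its own function makes it easy to iterate on the instructions without touching the API plumbing.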
This can be handled in a few different ways. One option is to define the tools in the requests, and allow Claude to loop through the requests, logic, tool calls, and responses. Another option is to create Agent Skills for each of these two tasks and give Claude permission to run them when it identifies a need for a weather or traffic report for a given location. These methods are probably the most consistent and accurate options, especially with the advanced understanding of Sonnet 4.6. The trade-off is that the token cost is likely to be higher as Claude reasons through what is necessary, makes the calls, and processes the responses in a loop until all of the requirements are satisfied.
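The tool-calling option can be sketched as the loop below: Claude decides when it needs weather or traffic data, the script executes the call, and the result is fed back until the model stops asking. The tool name, schema, and loop structure are illustrative assumptions, not the project's exact code.

```python
# Hypothetical tool definition in the Anthropic tool-use schema format.
WEATHER_TOOL = {
    "name": "get_weather",
    "description": "Current weather and short forecast for a location.",
    "input_schema": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
}

def run_tool_loop(client, messages, tools, handlers, model="claude-sonnet-4-6"):
    """Loop: model -> tool calls -> tool results -> model, until final text."""
    while True:
        response = client.messages.create(
            model=model, max_tokens=2000, tools=tools, messages=messages
        )
        if response.stop_reason != "tool_use":
            # No more tool requests: return the model's final text.
            return "".join(b.text for b in response.content if b.type == "text")
        # Echo the assistant turn, then answer each tool call it contains.
        messages.append({"role": "assistant", "content": response.content})
        results = [
            {
                "type": "tool_result",
                "tool_use_id": block.id,
                "content": handlers[block.name](**block.input),
            }
            for block in response.content
            if block.type == "tool_use"
        ]
        messages.append({"role": "user", "content": results})
```

Each iteration of this loop is another model call, which is where the higher token cost mentioned above comes from.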
Another, lower-cost option is to have your application parse out the locations and routes that might matter for your Daily Digest, make the API calls from your script, and put the response data into the context that Sonnet 4.6 uses to generate the digest. This can use fewer tokens because the parsing step can run on one or more smaller large language models (LLMs), or on just one additional Sonnet 4.6 call. It can introduce inconsistencies, though, because it’s harder to communicate contextual nuances between the logic that parses the relevant data and the Sonnet 4.6 call that summarizes it. For example, if you have one calendar event for a trip to a neighboring town and another for a trip to Egypt, the logic that parses origin/destination pairs might try to calculate traffic estimates for both, including a drive from your location to Egypt.
To limit these mistakes, it’s important to be explicit about the reasons and context in the prompts for parsing the location data, whether it’s for traffic today or for weather next week.
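One way to encode that explicitness is to have the parsing prompt return structured JSON that includes a sanity flag, then filter deterministically before firing any Routes API calls. The prompt wording and JSON shape below are assumptions for illustration.

```python
import json

# Hypothetical prompt for the smaller parsing model: it must flag whether a
# trip is plausibly drivable, so the Egypt case never reaches the traffic API.
PARSING_PROMPT = """Extract every trip implied by these calendar events as a
JSON list of objects with keys: origin, destination, drivable (true only if
this is plausibly a same-day car trip, not a flight or overseas travel).
Return ONLY the JSON. Events:
{events}"""

def plan_traffic_calls(parsed_json: str) -> list:
    """Keep only the trips the parsing model marked as drivable."""
    trips = json.loads(parsed_json)
    return [t for t in trips if t.get("drivable")]

# Example model output for the "neighboring town vs. Egypt" case:
model_output = json.dumps([
    {"origin": "Home", "destination": "Springdale", "drivable": True},
    {"origin": "Home", "destination": "Cairo, Egypt", "drivable": False},
])
calls = plan_traffic_calls(model_output)  # only the Springdale trip remains
```

Pushing the judgment call into the prompt and keeping the filter mechanical makes mistakes easier to debug: either the flag was wrong or the filter was, never both at once.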
All of this came together to create an API endpoint that, when called, returns a full summary of emails that need attention, calendar events for the day (and how to prepare for them), a few tailored news items, a list of relevant weather and traffic updates, and a prioritized list of things to focus on.
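The endpoint itself can be very small. The sketch below uses only the Python standard library to keep it self-contained; in practice a framework like Flask or FastAPI is more convenient. The response shape and `generate_digest` body are placeholders.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def generate_digest() -> dict:
    # Placeholder: the real version gathers API data and calls Sonnet 4.6.
    return {"emails": [], "events": [], "news": [],
            "weather_traffic": [], "priorities": []}

class DigestHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/digest":
            self.send_response(404)
            self.end_headers()
            return
        body = json.dumps(generate_digest()).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

def serve(port: int = 8000):
    """Block and serve the digest endpoint on the given port."""
    HTTPServer(("", port), DigestHandler).serve_forever()
```

Calling `serve()` and hitting `GET /digest` returns the JSON summary; anything that can make an HTTP request can then consume it.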
Exactly what is returned, what should be inferred from the information, and what format it is delivered in are all customizable through the Sonnet 4.6 prompts. Tuning this for my needs took some manual iteration and testing.
I run it locally and have a Python cron job trigger it once a day in the morning. It could be deployed on a DigitalOcean Droplet, called from a separate platform, and delivered to you via any channel you’d like. A good option might be to use Twilio to send yourself an SMS or WhatsApp message.
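The scheduling piece is a single crontab entry. The paths below are placeholders for wherever your script and virtual environment live.

```shell
# Run the digest script every morning at 7:00 and append output to a log.
0 7 * * * /usr/bin/python3 /home/me/daily_digest/run_digest.py >> /home/me/daily_digest/digest.log 2>&1
```

Redirecting stderr into the same log makes cron failures visible the next time you check the file.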
In the future, I’d like to add some social media connections to alert me when family or close friends post relevant content. This is a bigger challenge because many social media applications actively try to prevent this kind of automated monitoring.
If you are interested in using this project or building a similar one, you should create and use a separate Google account, rather than giving it access to your full calendar/Gmail. Then you can selectively forward emails or share your calendar events with the new Google Account in order to have stricter control over what you’re granting the AI access to.
If you want to run this kind of application on a remote machine, a good architecture is to set it up as an API on a DigitalOcean Droplet, then have a cron job run as a separate service that makes requests to it and sends the responses to whatever message delivery channel you prefer. Then you can set up standard API security infrastructure to make sure you are the only one with permission to access it.
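At minimum, that security layer can be a shared secret checked on every request. A sketch is below; the header name and environment variable are assumptions, and in production you would also want HTTPS and your platform's standard auth tooling.

```python
import hmac
import os

def is_authorized(headers: dict) -> bool:
    """Constant-time comparison of a client-supplied API key."""
    expected = os.environ.get("DIGEST_API_KEY", "")
    supplied = headers.get("X-Api-Key", "")
    return bool(expected) and hmac.compare_digest(supplied, expected)

# Demo only: in practice, set the key in the service's environment, not code.
os.environ["DIGEST_API_KEY"] = "example-secret"
```

`hmac.compare_digest` avoids timing side channels that a plain `==` comparison could leak, and the `bool(expected)` guard rejects everything if the key was never configured.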
Can we use other models from OpenAI or open source options?
Yes, it is definitely possible, but the results will vary. Sonnet 4.6 is recommended for its advanced reasoning and efficiency.
Can I install this myself?
Yes. Instructions for installation are in the GitHub repo. Google Auth can be challenging to set up. Other than Google Auth, it just requires setting up API keys for the Google Routes API, Weather API, Todoist API, News API, and any other services you would like to integrate.
Is it secure?
It’s as secure as you make it. You should be conscious of the APIs that you are sharing your data with, and make sure you limit who has access to make requests to the service.
Can you have it respond to emails and notifications for you, like OpenClaw or other tools?
Yes, this functionality could be added, but the goal was to filter information rather than to handle tasks independently.
Built on Sonnet 4.6, the Daily Digest application brings together your emails, calendar events, news, weather, traffic, and to-do list into a single, actionable summary. By integrating multiple APIs and applying advanced AI reasoning, it reduces information overload and lets you focus on what matters most each day.
The Daily Digest is fully customizable. You can expand it with new integrations, adjust the summary format, and tailor the delivery method to your workflow. Whether you run it locally or deploy it remotely, using a dedicated Google account and following best practices for API security will help keep your data safe.
With this foundation, you can continue to experiment with new data sources, delivery channels, and automation features to further streamline your daily routine and boost your productivity.
Thanks for learning with the DigitalOcean Community. Check out our offerings for compute, storage, networking, and managed databases.
Andrew is an NLP Scientist with 8 years of experience designing and deploying enterprise AI applications and language processing systems.