From opening dozens of tabs to a ready-to-go board

By Albert Pérez

Automating news scouting with a flow that captures headlines and delivers them into a ready-made Trello board

How we automated news scouting

Does this scene ring a bell?

Every morning someone on the team opens a pile of tabs: institutional sites, blogs, local media… They skim headlines, copy links, check whether they were already shared, decide what deserves to become a task. It is necessary work, yet repetitive, error-prone and nearly impossible to scale.

For us the question became:
“Can we free people from this manual watch and let them focus on analysing and deciding, not on collecting?”

The starting point: a routine that burns time

Our situation looked exactly like this:

  • Multiple sites and sources, some with RSS feeds and some without.
  • Manual review every single day.
  • Risk of forgetting sources, duplicating stories or reacting too late.

The outcome: many hours invested in “searching” and very few in interpreting and acting.


The goal: an “assistant” that prepares the working board

We framed a very simple objective:

Every morning, without anyone lifting a finger, we want a list of fresh headlines already waiting on a Trello board, ready to prioritise and assign.

Practically speaking, that means:

  • Aggregating headlines from multiple sources.
  • Avoiding duplicates (so the same story does not pop up two or three times).
  • Keeping only the essentials: title, link, date and origin.
  • Leaving everything in a work-friendly format (cards/tasks).

No need to download full pages or deal with entire bodies of text: just useful metadata.


How the automated flow works

Picture a very disciplined “digital assistant” that, every day:

  1. Visits the predefined sources (sites, blogs, RSS…).
  2. Checks what is new compared with the previous run.
  3. Verifies whether a story was already captured to avoid duplicates.
  4. Creates a card in a board (Trello in our case) with the basic info.
  5. Stores a historical log in a spreadsheet so we can audit and analyse trends.
  6. Notifies the team on Slack if something breaks (for example, a site changes its layout), so failures don't go unnoticed.

The result: the team no longer “hunts” across the internet — they open the board and immediately decide what to do with each headline.
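
To make that concrete, here is a minimal sketch of what the collection side (steps 1-3, plus the metadata we keep) can look like in Python. It assumes RSS-style sources read with the feedparser library; the source list and every name in it are placeholders, not our production code:

```python
"""Minimal sketch of a daily run over RSS-style sources.
Names and structure are illustrative, not production code."""
import hashlib

import feedparser  # third-party: pip install feedparser

# Placeholder sources: origin label -> feed URL
SOURCES = {
    "Example blog": "https://example.com/feed.xml",
}

def fingerprint(link: str) -> str:
    """Stable ID for a story, used to detect duplicates across runs."""
    return hashlib.sha256(link.encode("utf-8")).hexdigest()

def run_daily(seen: set) -> list:
    """Visit every source and return only the fresh stories,
    keeping just the four fields we care about."""
    fresh = []
    for origin, url in SOURCES.items():
        feed = feedparser.parse(url)              # 1. visit the source
        for entry in feed.entries:
            story_id = fingerprint(entry.link)
            if story_id in seen:                  # 2-3. already captured: skip
                continue
            seen.add(story_id)
            fresh.append({                        # only the useful metadata
                "title": entry.title,
                "link": entry.link,
                "date": entry.get("published", ""),
                "origin": origin,
            })
    return fresh
```

The cards, the historical log and the alerts (steps 4-6) consume the list this function returns; the technical snapshot further down sketches that side.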


What changed in our day to day

Numbers will vary by organisation, yet the impact showed up fast for us:

  • We went from spending 60–90 minutes a day reviewing sources…
  • …to investing 10–15 minutes reviewing a board that is already prepared.
  • Human errors dropped (forgetting sources, repeating stories, missing something relevant).
  • Prioritisation improved: everything arrives with the same format and quality.
  • We gained a historical dataset to spot trends: publishing spikes, frequency by source, and more.

In short: less operational effort, more analytical capacity.


Challenges we had to solve

Along the way we faced several hurdles that may help if you plan something similar:

  • Very different sources: some sites are static, others change often, some have RSS, others don’t.
  • Silent layout changes: if a site redesigns itself, the system might stop “understanding” it, so we added clear alerts when that happens.
  • Dates and formats all over the place: normalising that metadata was essential.
  • Duplicate prevention: sometimes the same story reappears under a slightly tweaked URL (tracking parameters, a trailing slash), and the system still has to recognise it; one approach is sketched after this list.
  • Being respectful with the sites we visit: this is not about hammering them with thousands of requests per minute, but about acting responsibly.
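
On the duplicates in particular: one way to recognise a story whose URL has been tweaked is to canonicalise links before fingerprinting them. A sketch using only the standard library; the list of tracking parameters is an assumption, tune it to your sources:

```python
# Canonicalise a URL before fingerprinting it, so the same story with
# tracking parameters or a trailing slash is still seen as a duplicate.
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Assumption: the usual tracking culprits; extend as needed.
TRACKING_PREFIXES = ("utm_", "fbclid", "gclid", "ref")

def canonical_url(url: str) -> str:
    parts = urlsplit(url.strip())
    # Keep real query parameters, drop the ones that only track visitors.
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if not k.startswith(TRACKING_PREFIXES)]
    return urlunsplit((
        parts.scheme.lower(),
        parts.netloc.lower(),
        parts.path.rstrip("/"),  # /story/ and /story are the same article
        urlencode(query),
        "",                      # fragments never change the story
    ))
```

With that in place, https://news.example/story/?utm_source=feed and https://news.example/story produce the same fingerprint.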

Design choices that keep it maintainable

Beyond the technical bits, some design decisions made the difference:

  • A single data model for every source: all stories are stored with the same fields, which simplifies analysis and future integrations (a minimal version is sketched after this list).
  • Adding new sources without rewrites: each source follows a clear “contract”, so scaling up does not break everything.
  • Actionable alerts: whenever something fails, the message points to the source, the issue and where to start looking.
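
Here is what that contract can look like; the field and class names are illustrative, not copied from our codebase:

```python
# The "contract" every source module fulfils: whatever the site looks
# like, it must yield stories in this one shape.
from dataclasses import dataclass
from typing import Iterator, Protocol

@dataclass(frozen=True)
class Story:
    title: str
    link: str
    date: str     # normalised to ISO 8601 during ingestion
    origin: str

class Source(Protocol):
    name: str

    def fetch(self) -> Iterator[Story]:
        """The only method a new source has to implement."""
        ...
```

Adding a source then means writing one class that yields Story objects; nothing downstream changes.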

A quick technical snapshot

For the tech-curious, here is the short version:

  • We built it with Python, using small modules that know how to read each source type.
  • The automated run always follows the same steps: extract, validate, normalise, deduplicate and create tasks.
  • We use lightweight integrations with Trello (cards), Google Sheets (historical log) and Slack (alerts); a sketch follows this list.
  • It runs automatically every day in a reproducible environment.
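
To show how lightweight those integrations can be, here is a sketch of the Trello and Slack side using their public HTTP APIs via the requests library; every key, token, ID and URL below is a placeholder:

```python
# Sketch of the Trello and Slack integrations. Trello exposes a REST
# endpoint for creating cards; Slack accepts posts to incoming webhooks.
import requests  # pip install requests

def create_trello_card(story: dict, key: str, token: str, list_id: str) -> None:
    """Create a card in the given Trello list via POST /1/cards."""
    response = requests.post(
        "https://api.trello.com/1/cards",
        params={
            "key": key,
            "token": token,
            "idList": list_id,
            "name": story["title"],
            "desc": f"{story['link']}\n{story['origin']} | {story['date']}",
        },
        timeout=10,
    )
    response.raise_for_status()

def alert_slack(webhook_url: str, source: str, error: str) -> None:
    """Notify the team through a Slack incoming webhook."""
    response = requests.post(
        webhook_url,
        json={"text": f"News flow failed on '{source}': {error}"},
        timeout=10,
    )
    response.raise_for_status()
```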

For the end user, the only thing that matters is that it behaves reliably and predictably, not the implementation details.


What could come next?

Now that the foundation is ready, a few ideas emerge:

  • Use AI to provide a short daily summary of the captured stories.
  • Use AI-driven filters to highlight key headlines or discard irrelevant ones based on predefined criteria.

Conclusion

The real value of this project is not the code — it is the shift in focus:

Experts stop spending hours “watching” sources and can devote their time to what brings the most value: interpreting, deciding and executing.

An automated flow turns scattered sites and channels into a structured stream of actionable tasks.
It is the difference between reacting late to the news and finding a ready-to-go board every morning.


Want to learn more?

If your organisation is also spending too many hours monitoring sites and newsletters, let’s talk.

I will share a visual outline of the architecture and the steps we follow to automate this flow, tailored to your context.

#Automation #InformationManagement #Productivity #DigitalTransformation #News #Communication #DataDriven