Hi folks! This month brings a bunch of new features and an improvement to one of our sources.

Here’s what we worked on in May:

  • NEW: Fetch data from other collections
  • NEW: Trash
  • NEW: Additional GPT models
  • NEW: AI Agent - Run on Collection Items
  • Improved: AI Agent - Site Scraper

Now to the details!

The Highlight of May

New Feature: Fetch Data From Other Collections

With this feature, automated workflows just got much easier, more controllable, and more accurate.

What It Does

This feature allows you to search other collections for data that matches (or conflicts with) data in your current collection, using variables.

These variables could be:

  • Data from your collection (the value of {column 1})
  • Alphanumeric values (text, numbers, etc.)
  • A mix of the above (text; number; {column 1}; {column 2})
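If you like to think in code, here's a rough sketch of the idea. Datablist is no-code, so this is not its actual API; the template syntax, column names, and helper functions below are hypothetical, purely to illustrate how a lookup key built from your variables can match items in another collection:

    # Illustrative sketch only, not Datablist's API.
    def build_key(item, template):
        """Fill a hypothetical template like 'lead; {email}; {company}'
        with values from an item's columns."""
        key = template
        for column, value in item.items():
            key = key.replace("{" + column + "}", str(value))
        return key

    def lookup(item, other_collection, template):
        """Return items in other_collection whose key matches this item's key."""
        target = build_key(item, template)
        return [other for other in other_collection
                if build_key(other, template) == target]

    # Hypothetical sample data with shared columns in both collections.
    prospects = [{"email": "ada@example.com", "company": "Acme"}]
    contacted = [{"email": "ada@example.com", "company": "Acme"},
                 {"email": "bob@example.com", "company": "Globex"}]

    for prospect in prospects:
        matches = lookup(prospect, contacted, "lead; {email}; {company}")
        print(prospect["email"],
              "already contacted" if matches else "new prospect")

This is exactly the kind of check behind use case 1 below: look a prospect up in your "already contacted" collection before messaging them.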

Why It Matters

At first, the value might not seem obvious. But think about it for a second: if you can cross-match data with other collections, an entire universe of new possibilities opens up.

With this feature, you could, for example:

  1. Make sure no prospect receives a message twice by looking them up in other collections first
  2. Create advanced filtering logic to segment and organize your data more effectively
  3. Decrease data enrichment costs by searching through other collections first
  4. Build advanced automation that references data across collections
  5. … and more

How To Use It

  • Go to Enrich → Automation Utils
  • Select "Lookup Item in other collection"
  • Define your search criteria and run!

New Features and Improvements

New Features

Trash

Your deleted collections are now kept in the trash for 7 days before being permanently deleted.

You can use the trash to restore deleted collections with 2 mouse clicks!

Why we built it: Sometimes you delete data, and 2-3 hours later you realize "Damn, I shouldn't have deleted that". This feature prevents such situations.

AI Agent - Run on Collection Items

You can now run the AI Agent as a source in a new collection, using variables from other collections.

Why we built it: This is a funny story. A prospect asked for this feature, and we built it in a few days. By the time we got back to them so they could test it, they had already found another solution (we don't regret building it).

Improvements

Pagination for Site Scraper

The AI Agent - Site Scraper can now paginate through up to 5,000 pages.

Why we did it: This makes scraping at scale much easier and lets you scrape any directory, such as Yellow Pages or Zillow, with Datablist.
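For the curious, pagination conceptually boils down to following "next page" links until none are left or a cap is hit. Here is a minimal sketch of that loop (the URLs, the fetch_page callable, and the demo fetcher are all hypothetical; this is not Datablist's implementation):

    MAX_PAGES = 5000  # the new cap mentioned above

    def scrape_directory(fetch_page, first_url):
        """Follow 'next' links until none remain or the page cap is hit.
        fetch_page is a hypothetical callable returning (items, next_url)."""
        url, pages = first_url, 0
        results = []
        while url and pages < MAX_PAGES:
            items, url = fetch_page(url)
            results.extend(items)
            pages += 1
        return results

    def fake_fetch(url):
        # Stand-in for a real HTTP fetch: serves pages /1 through /3.
        page = int(url.rsplit("/", 1)[1])
        next_url = f"https://directory.example.com/{page + 1}" if page < 3 else None
        return [f"listing-{page}"], next_url

    print(scrape_directory(fake_fetch, "https://directory.example.com/1"))
    # ['listing-1', 'listing-2', 'listing-3']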

Multi Collection Deduplication

When you deduplicate multiple collections, only items that appear in more than one collection are counted as duplicates. Duplicates within a single collection are skipped.

Why we did this: It enables more controlled cleaning workflows.
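In code terms, the behavior described above might look like this sketch (assumed semantics, not Datablist's actual implementation; the key function and sample data are hypothetical):

    def cross_collection_duplicates(collections, key):
        """Return key values that appear in more than one collection.
        Repeats inside a single collection are deliberately ignored."""
        seen_in = {}  # key value -> set of collection indices
        for i, collection in enumerate(collections):
            for item in collection:
                seen_in.setdefault(key(item), set()).add(i)
        return {k for k, ids in seen_in.items() if len(ids) > 1}

    a = [{"email": "ada@example.com"},   # appears twice in a only: skipped
         {"email": "ada@example.com"},
         {"email": "bob@example.com"}]
    b = [{"email": "bob@example.com"}]   # bob spans both collections: flagged

    print(cross_collection_duplicates([a, b], key=lambda item: item["email"]))
    # {'bob@example.com'}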

That’s it for this month!

If you want us to build something for you, pitch me your feature on LinkedIn 👈🏽