Latenode

Personalized cold email icebreakers with website scraping and AI

This automation workflow creates an advanced AI-powered cold email personalization system that generates deeply personalized multi-line icebreakers to improve cold email reply rates.

The system scrapes website data for each prospect, analyzes multiple pages using GPT-4 to summarize the content, and then generates custom email openers that make recipients believe the sender has significant knowledge about their business and interests. This allows the user to personalize their cold email campaigns and achieve higher engagement rates. The workflow integrates with Apify for web scraping, OpenAI GPT-4 for content summarization and icebreaker generation, and Google Sheets to store the prospect data and generated icebreakers.

Updated Apr 6, 2026 · Est. run: 26s · Est. cost: $0.0703
How Latenode estimates time and cost

Latenode bills workflow runs in credits: 1 credit = 30 seconds of processing. Minimum charge per run depends on your plan. Plug-and-Play (PnP) AI nodes are billed separately—each PnP token is $1 USD, charged pay-as-you-go at vendor cost plus a small processing fee, with no API keys required.

Full pricing — how credits work →
Scraping & data collection

What this template does

  • Extracts relevant website content using Apify web scraper
  • Analyzes scraped data using GPT-4 to summarize key points
  • Generates personalized cold email icebreakers based on the summaries
  • Stores the icebreakers in a Google Sheets spreadsheet for easy access
  • Enables users to personalize cold email campaigns and boost reply rates

How it works

1
Trigger

Get list of prospect website URLs

The workflow starts by fetching a list of prospect website URLs from the Apify service.

2
Logic

Remove duplicate URLs

The list of website URLs is then de-duplicated using a JavaScript integration to ensure each prospect is only processed once.
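A minimal JavaScript sketch of this de-duplication step (the `websiteUrl` field name and sample data are illustrative, not the template's exact schema). Normalizing trailing slashes and case keeps `https://acme.com` and `https://acme.com/` from being processed twice:

```javascript
// Sample prospect list; in the workflow this comes from the Apify trigger.
const prospects = [
  { websiteUrl: "https://acme.com" },
  { websiteUrl: "https://acme.com/" },
  { websiteUrl: "https://globex.io" },
];

// De-duplicate by a normalized key: lowercase, trailing slashes stripped.
const seen = new Set();
const uniqueProspects = prospects.filter((p) => {
  const key = p.websiteUrl.replace(/\/+$/, "").toLowerCase();
  if (seen.has(key)) return false;
  seen.add(key);
  return true;
});
// uniqueProspects now holds one entry per site
```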

3
Action

Extract links from website HTML

For each website URL, the workflow extracts all the links on the page using an HTML integration.
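The link extraction can be sketched as follows. The HTML integration uses a proper parser; a regex over `href` attributes is enough to show the shape of the output:

```javascript
// Sample page HTML; in the workflow this is the fetched prospect page.
const html = `
  <a href="/about">About</a>
  <a href="/pricing">Pricing</a>
  <a href="https://twitter.com/acme">Twitter</a>
`;

// Collect every href value on the page.
const links = [...html.matchAll(/href="([^"]+)"/g)].map((m) => m[1]);
// links → ["/about", "/pricing", "https://twitter.com/acme"]
```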

4
Logic

Filter internal links

The extracted links are then filtered to keep only the internal links (starting with /) that point to pages within the prospect's website.
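The filter itself is a one-liner; resolving the relative paths against the prospect's base URL (shown here with the standard `URL` constructor) makes them fetchable in the next step:

```javascript
const base = "https://acme.com"; // the prospect's site (illustrative)
const links = ["/about", "/pricing", "https://twitter.com/acme", "#top"];

// Keep only same-site links (relative paths starting with "/"),
// then resolve them into absolute URLs.
const internal = links
  .filter((href) => href.startsWith("/"))
  .map((href) => new URL(href, base).toString());
// internal → ["https://acme.com/about", "https://acme.com/pricing"]
```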

5
Action

Fetch web page content

The workflow then fetches the HTML content for each of the filtered internal links using an HTTP request integration.
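A sketch of this fetch loop, using Node 18+'s built-in `fetch` in place of Latenode's HTTP request node (the injectable `fetchFn` parameter is an addition for testability, not part of the template):

```javascript
// Fetch the HTML for each internal link, skipping failed requests.
async function fetchPages(urls, fetchFn = fetch) {
  const pages = [];
  for (const url of urls) {
    const res = await fetchFn(url);
    if (res.ok) pages.push({ url, html: await res.text() });
  }
  return pages;
}
```

Fetching sequentially, as here, is gentler on the target site; `Promise.all` would be faster but risks rate-limiting.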

6
AI

Summarize web page content

The HTML content for each web page is analyzed using the GPT-4 model from the OpenAI integration, which generates a summary of the page's content.
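The shape of the summarization request sent to OpenAI's chat completions endpoint looks roughly like this; the prompt wording and truncation length are illustrative, not the template's exact configuration:

```javascript
// Build one summarization request per fetched page.
function buildSummaryRequest(pageHtml) {
  return {
    model: "gpt-4",
    messages: [
      {
        role: "system",
        content: "Summarize the key points of this web page in 3 short bullet points.",
      },
      // Truncate long pages to stay within the model's context window.
      { role: "user", content: pageHtml.slice(0, 12000) },
    ],
    temperature: 0.3, // low temperature keeps summaries factual
  };
}
```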

7
Logic

Limit web pages

To keep the data manageable, the workflow limits the number of web page summaries to a maximum of 3 per prospect.

8
Action

Fetch home page

In addition to the web page summaries, the workflow also fetches the homepage content for each prospect using the Apify integration.

9
AI

Aggregate web page summaries

The workflow then aggregates all the web page summaries for each prospect, combining them into a single representation of the prospect's online presence.
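The aggregation step amounts to joining the per-page summaries (capped at 3 per prospect, per the limit step above) into one block of context for the icebreaker prompt. Sample summaries are illustrative:

```javascript
const summaries = [
  "About page: 10-person agency focused on B2B SaaS.",
  "Pricing page: three tiers, starting at $99/mo.",
  "Blog post: recent piece on cold email deliverability.",
  "Careers page: hiring two SDRs.",
];

// Cap at 3 summaries (the limit step), label each, and join into
// one context block for the icebreaker prompt.
const aggregated = summaries
  .slice(0, 3)
  .map((s, i) => `Page ${i + 1}: ${s}`)
  .join("\n");
```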

10
AI

Generate personalized icebreaker

Finally, the workflow uses the GPT-4 model to generate a multi-line personalized icebreaker for each prospect based on the aggregated web page summaries and homepage content.
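A hedged sketch of the final prompt: the exact wording in the template may differ, but the key idea is feeding both the aggregated summaries and the homepage content into a single GPT-4 call:

```javascript
// Build the icebreaker-generation request for one prospect.
function buildIcebreakerRequest(aggregatedSummaries, homepageText) {
  return {
    model: "gpt-4",
    messages: [
      {
        role: "system",
        content:
          "Write a 2-3 line cold email icebreaker that references specific " +
          "details from the prospect's website. Be conversational, not salesy.",
      },
      {
        role: "user",
        content: `Homepage:\n${homepageText}\n\nPage summaries:\n${aggregatedSummaries}`,
      },
    ],
    temperature: 0.7, // higher temperature for more natural-sounding openers
  };
}
```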

11
Action

Store prospect data

The prospect data, including the website URLs and the generated personalized icebreakers, is then stored in a Google Sheet using the Google Sheets integration.

Setup guide

1

Add Apify Credential

1. In the Latenode Credentials panel, add a new credential for Apify.
2. Enter your Apify API key.

2

Configure OpenAI GPT-4 Credential

1. In the Latenode Credentials panel, add a new credential for OpenAI.
2. Enter the API key or model ID for the GPT-4 model you want to use.

3

Set up Google Sheets Connection

1. In the Latenode Credentials panel, add a new credential for Google Sheets.
2. Authenticate your Google account and grant the necessary permissions.

4

Configure Apify Node

1. Add an Apify node to your workflow.
2. Select the Apify credential you created earlier.
3. Configure the actor, URLs to scrape, and any other limits or options.

5

Set up OpenAI GPT-4 Node

1. Add an OpenAI GPT-4 node to your workflow.
2. Select the OpenAI credential you created earlier.
3. Configure the prompts and parameters for generating the personalized icebreakers.

Requirements

Apify account and API key to enable web scraping of target websites
OpenAI GPT-4 API key to access the content summarization and icebreaker generation capabilities
Google Sheets integration to store the prospect data and generated icebreakers
Inside the Latenode workspace, configure the 'headless-browser' and 'javascript' nodes to scrape website content

FAQ

Common questions about this template

How much does it cost to run this template?

Each run uses credits on your Latenode plan. We charge for processing time (1 credit = 30 seconds), so your actual cost depends on your plan and how long the run takes. See the pricing page for plan details and how credits work.

More templates

You might also like

Browse all templates →
Scraping & data collection

Automatically Sync Google Maps Business Data to a Spreadsheet

This automation workflow allows users to efficiently scrape business data from Google Maps, including names, contact details, and reviews, and export the structured information into a spreadsheet or database. The workflow uses the SerpAPI service to retrieve Google Maps search results, which are then transformed and appended to a Google Sheet. This enables users to generate leads, conduct market analysis, and gain valuable insights from the collected data in a cost-effective and scalable manner.

26s · $0.0703
Scraping & data collection

Scrape Zillow property data and sync to Google Sheets automatically

This Latenode automation extracts real estate listing details from Zillow and automatically populates a Google Sheets spreadsheet with the property data. It leverages the Scrape.do web scraping API to bypass anti-bot protections and fetch the full HTML of Zillow listings, then parses key information like price, address, days on Zillow, and Zestimate, and saves the structured results into a Google Sheet. This solution is designed for real estate professionals, investors, and market analysts who need to collect property data at scale without manual effort, enabling better market research, portfolio tracking, and lead generation.

9s · $0.0006
Scraping & data collection

Automatically scrape and store Booking.com hotel data

This workflow automates the search and extraction of hotel data from Booking.com, triggered by a chat message. It uses a combination of web scraping with Bright Data's Web Scraper and AI-powered data processing with OpenRouter to deliver a concise, human-friendly list of hotels, including the title, address, original price, and final price. The final output is a clean and formatted report, making it a valuable tool for travelers, event planners, and business professionals who need to quickly find accommodation.

26s · $0.0703