Latenode

Automatically extract LinkedIn profiles and sync to Google Sheets

This Latenode automation helps users extract data from LinkedIn profiles and save it directly to a Google Sheets spreadsheet. By leveraging the Apify web scraping platform, the workflow can automatically fetch profile details such as name, company, and other relevant information.

Users can then analyze this data in the organized Google Sheets format, enabling tasks like lead generation, recruitment research, and market analysis. The automation streamlines the process of gathering targeted LinkedIn profile data and centralizing it for further use, saving time and effort compared to manual scraping and data entry.

Updated Apr 2, 2026 · Est. run: 10s · Est. cost: $0.0006
How Latenode estimates time and cost

Latenode bills workflow runs in credits: 1 credit = 30 seconds of processing. The minimum charge per run depends on your plan. Plug-and-Play (PnP) AI nodes are billed separately: each PnP token is $1 USD, charged pay-as-you-go at vendor cost plus a small processing fee, with no API keys required.

Full pricing — how credits work →
Scraping & data collection

What this template does

  • Extracts LinkedIn profile data like name, company, and other details using Apify web scraper
  • Saves extracted data directly into a Google Sheets spreadsheet for further analysis
  • Enables tasks like lead generation, recruitment research, and market analysis with organized profile data
  • Automates the process of gathering targeted LinkedIn profile information, saving time and effort
  • Centralizes LinkedIn profile data in a structured Google Sheets format for easy access and use

How it works

1
Trigger

Read LinkedIn URLs from Google Sheet

The workflow starts by reading a list of LinkedIn profile URLs from a Google Sheets spreadsheet. This step extracts the URLs that will be used to fetch the profile data.
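The URL-reading step can be sketched in a few lines. This is a minimal sketch, assuming the third-party gspread client and a sheet whose first column holds the profile URLs; the credential filename and spreadsheet name are placeholders, not values from this template.

```python
def extract_urls(column_values):
    """Keep only non-empty cells that look like LinkedIn profile URLs."""
    return [
        v.strip()
        for v in column_values
        if v and v.strip().startswith("https://www.linkedin.com/in/")
    ]

if __name__ == "__main__":
    import gspread  # third-party Google Sheets client, installed separately

    gc = gspread.service_account(filename="service_account.json")  # placeholder path
    sheet = gc.open("LinkedIn Leads").sheet1                       # placeholder name
    urls = extract_urls(sheet.col_values(1))  # first column of the sheet
```

Filtering the column this way also skips a header row or stray notes before the URLs are handed to the scraper.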

2
Action

Fetch Profile Data from LinkedIn

The workflow submits the LinkedIn profile URLs to the Apify API over HTTP, requesting the relevant data for each profile, such as the name, company, and other details.

3
Action

Run Apify LinkedIn Scraper

The Apify LinkedIn scraper actor is executed to process the profile URLs and extract the desired data. This step uses Apify's web scraping capabilities to automate the data extraction from LinkedIn.
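Steps 2 and 3 together amount to one call to Apify's "run an actor synchronously and get dataset items" REST endpoint. The sketch below assumes that endpoint; the actor ID and the input field name (`profileUrls`) are placeholders, since the input schema depends on the actor you actually select.

```python
import json
import urllib.request

APIFY_BASE = "https://api.apify.com/v2"

def build_run_url(actor_id: str, token: str) -> str:
    """URL that starts an actor run and returns its dataset items in one call."""
    return f"{APIFY_BASE}/acts/{actor_id}/run-sync-get-dataset-items?token={token}"

def scrape_profiles(actor_id: str, token: str, urls: list) -> list:
    payload = json.dumps({"profileUrls": urls}).encode()  # field name is an assumption
    req = urllib.request.Request(
        build_run_url(actor_id, token),
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=300) as resp:
        return json.loads(resp.read())  # one record per scraped profile

if __name__ == "__main__":
    records = scrape_profiles(
        "someuser~linkedin-profile-scraper",  # hypothetical actor ID
        "YOUR_APIFY_TOKEN",
        ["https://www.linkedin.com/in/example"],
    )
```

The synchronous endpoint keeps the sketch short; for long runs, starting the actor and polling for the dataset separately is the usual alternative.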

4
Action

Save Data to Google Sheets

The extracted LinkedIn profile data is then stored in a Google Sheets spreadsheet, allowing users to easily analyze and organize the information for tasks like lead generation, recruitment research, and market analysis.
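The save step boils down to flattening each scraped record into a row. This is a sketch assuming each result is a dict; the field names (`fullName`, `companyName`, `url`) and the spreadsheet name are illustrative and should be matched to your actor's actual output.

```python
def record_to_row(record: dict) -> list:
    """Flatten one profile record into the column order used in the sheet."""
    return [
        record.get("fullName", ""),
        record.get("companyName", ""),
        record.get("url", ""),
    ]

if __name__ == "__main__":
    import gspread  # third-party Google Sheets client, installed separately

    gc = gspread.service_account(filename="service_account.json")  # placeholder path
    ws = gc.open("LinkedIn Leads").sheet1                          # placeholder name
    records = [{"fullName": "Jane Doe", "companyName": "Acme"}]
    ws.append_rows([record_to_row(r) for r in records])
```

Using `.get()` with a default keeps the row shape stable even when a profile is missing a field, so columns never shift in the sheet.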

Setup guide

1

Add Apify Credential

1. In the Latenode Credentials panel, add an Apify credential by providing your Apify API token.
2. Name the credential for easy reference, e.g. 'Apify Scraper'.

2

Configure Google Sheets Connection

1. In the Latenode Credentials panel, add a Google Sheets credential by authorizing Latenode to access your Google account.
2. Name the credential for easy reference, e.g. 'Google Sheets Storage'.

3

Set up LinkedIn Profile Scraper

1. Add a Latenode Headless Browser node to your workflow.
2. In the node settings, configure the browser session to handle any necessary LinkedIn login or authentication requirements.
3. Connect the Headless Browser node to an Apify node.
4. In the Apify node settings, select the Apify credential you added earlier and choose the 'LinkedIn Profile Scraper' actor to execute.

4

Map Extracted Data to Google Sheets

1. Add a Google Sheets node to your workflow.
2. In the node settings, select the Google Sheets credential you added earlier.
3. Map the extracted LinkedIn profile data fields (e.g. name, company, profile URL) to the corresponding Google Sheets columns.

5

Customize Scraping Parameters

1. In the Apify node settings, configure any desired scraping parameters such as the number of profiles to fetch, data fields to extract, or filtering options.
2. Optionally, add a Latenode Data Mapper node before the Google Sheets node to transform or enrich the extracted data as needed.
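The kinds of parameters mentioned in step 1 typically live in the actor's input object. The dict below is a hypothetical example only; the exact field names depend on the scraper actor you chose, so treat every key here as a placeholder rather than a real schema.

```python
# Hypothetical Apify actor input illustrating common scraping parameters.
actor_input = {
    "profileUrls": ["https://www.linkedin.com/in/example"],
    "maxItems": 50,  # cap the number of profiles fetched per run
    "fields": ["fullName", "companyName", "headline"],  # subset of data to keep
}
```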

Requirements

  • Connect your LinkedIn account to the Apify platform to enable scraping of LinkedIn profiles
  • Create a Google Sheets spreadsheet to store the extracted LinkedIn profile data
  • Authorize the Latenode workflow to access and write data to your Google Sheets spreadsheet
  • Configure the Apify scraper within the Latenode workspace to fetch the desired LinkedIn profile data

FAQ

Common questions about this template

What data does this template extract from LinkedIn profiles?

The LinkedIn scraper automatically extracts profile information such as name, company, job title, and other relevant details from LinkedIn profiles. This data is then saved directly to a Google Sheets spreadsheet for further analysis and use.

More templates

You might also like

Browse all templates →
Scraping & data collection

Automatically Sync Google Maps Business Data to a Spreadsheet

This automation workflow allows users to efficiently scrape business data from Google Maps, including names, contact details, and reviews, and export the structured information into a spreadsheet or database. The workflow uses the SerpAPI service to retrieve Google Maps search results, which are then transformed and appended to a Google Sheet. This enables users to generate leads, conduct market analysis, and gain valuable insights from the collected data in a cost-effective and scalable manner.

26s · $0.0703
Scraping & data collection

Scrape Zillow property data and sync to Google Sheets automatically

This Latenode automation extracts real estate listing details from Zillow and automatically populates a Google Sheets spreadsheet with the property data. It leverages the Scrape.do web scraping API to bypass anti-bot protections and fetch the full HTML of Zillow listings, then parses key information like price, address, days on Zillow, and Zestimate, and saves the structured results into a Google Sheet. This solution is designed for real estate professionals, investors, and market analysts who need to collect property data at scale without manual effort, enabling better market research, portfolio tracking, and lead generation.

9s · $0.0006
Scraping & data collection

Automatically scrape and store Booking.com hotel data

This workflow automates the search and extraction of hotel data from Booking.com, triggered by a chat message. It uses a combination of web scraping with Bright Data's Web Scraper and AI-powered data processing with OpenRouter to deliver a concise, human-friendly list of hotels, including the title, address, original price, and final price. The final output is a clean and formatted report, making it a valuable tool for travelers, event planners, and business professionals who need to quickly find accommodation.

26s · $0.0703