
Getting Started

In this guide, we’ll walk through how to set up Scrapeless with Pipedream. With just a few basic configurations, you can automatically collect and process data scraped from any search engine.

First, you’ll need to complete the following setup:

  1. Scrapeless API Key: Please sign up for a Scrapeless account. Once registered, you can access your API key from the dashboard.

  2. Pipedream Account: Create a Pipedream account.

Set Up Your Scrapeless API Key in Pipedream

You need to log in to your Scrapeless account and go to the dashboard to obtain your API key. Once you have it, go to the “Accounts” tab in Pipedream and add the key there, as shown below:

After that, set your API key like this:

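The key is stored on the connected account, so you normally won’t paste it into individual steps. If you later need to reference it from a Node.js code step, Pipedream exposes connected-account credentials through an app prop. The sketch below assumes the app slug is scrapeless and the credential field is named api_key; both are assumptions and may differ in the actual integration:

export default defineComponent({
  props: {
    // Connect the Scrapeless account you added under "Accounts".
    // The app slug "scrapeless" is an assumption.
    scrapeless: {
      type: "app",
      app: "scrapeless",
    },
  },
  async run({ $ }) {
    // The credential field name ("api_key") depends on the app's auth schema.
    const apiKey = this.scrapeless.$auth.api_key;
    return { keyConfigured: Boolean(apiKey) };
  },
});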

Make Your First API Request

We’ll use Pipedream’s Scrapeless integration to fetch data from the Scrapeless API. First, create a new workflow. On the Pipedream dashboard, click the “New Workflow” button. This will open a new workflow creation window:


Step 1. Set Up the Trigger Step

Choose whether you want to trigger the workflow manually or automatically using a webhook, HTTP request, or any other available trigger.

For the purpose of this tutorial, we’ll use a Scheduled Trigger to start the workflow. You can adjust this later to use other triggers, such as manual execution or event-driven actions.

It should look like this:

After saving, you’ll be able to use this trigger in your workflow. Now we can proceed to configure the action step.

Step 2. Set Up the Action Step

  1. Now, let’s add an action step to the Pipedream workflow. From the Actions dropdown menu, select Scrapeless:

These are the possible actions:

  2. Next, connect the account where you configured your API key to this step:

Once the account is connected, you can fill in the remaining parameters and test the query. With that, you’ve set up your first Scrapeless request in Pipedream.

After adjusting the configuration to your use case, you’ll be ready to deploy the workflow.

How to Create an Intelligent Knowledge Graph Crawling System

Prerequisites

  • You have registered on Scrapeless and obtained an API token.
  • You have a Discord Webhook URL (for sending notifications).

Step 1: Add a Trigger

  • Type: Schedule
  • Trigger Time: Every day at 08:00 (UTC)
  • Method: Use either Cron or a fixed time interval
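For example, if you choose a custom cron expression, a daily 08:00 UTC run looks like this:

0 8 * * *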


Step 2: Configure Parameters

Component:

  • google-search

Parameter Settings:

  • query: coffee
  • gl: us (optional)
  • hl: en (optional)
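For orientation, the extraction step in the next section only reads the knowledge panel’s web results from this step’s return value. The snippet below is a hypothetical, trimmed sketch of just those fields (the real response contains many more):

// Hypothetical excerpt of steps.scrape_google.$return_value
{
  knowledge_graph: {
    web_results: [
      { link: "https://en.wikipedia.org/wiki/Coffee" },
      // ...more entries, each with a link field
    ],
  },
}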


Step 3: Extract Information

This step extracts the knowledge panel information from the keyword data obtained in the previous step.

Add a Node.js code step with the following code:

export default defineComponent({
  async run({ steps }) {
    // Return value of the google-search step (named "scrape_google" here);
    // adjust the step name if yours differs.
    const searchResult = steps.scrape_google?.$return_value;
    const webResults = searchResult?.knowledge_graph?.web_results;

    if (!searchResult || !webResults) {
      throw new Error("❌ No valid results returned from Scrapeless Google Search.");
    }

    // Keep only the links from the knowledge panel's web results.
    const links = webResults.map((item) => item.link);

    return {
      links,
    };
  },
});

Step 4: Send Discord Notification

This step sends the information obtained in Step 3 to your specified Discord channel.

Add a Node.js step with the following example code:

import { axios } from "@pipedream/platform";
 
export default defineComponent({
  async run({ steps, $ }) {
    const results = steps.extract_coffee_search_results.$return_value?.links || []
    console.log("result", results)
    const sendLinks = results.slice(0, 5)
 
    if (!sendLinks || sendLinks.length === 0) {
      console.log("✅ No search results to notify.");
      return { status: "no_results" };
    }
 
    const webhookUrl = "https://discord.com/api/webhooks/your_webhook_id/your_webhook_token"; // 🟡 Please replace it with your own webhook URL
 
    const lines = sendLinks.map(r => `📌 ${r}`).join("\n\n");
 
    const message = {
      content: `📡 **Keyword Monitor: "coffee"**\n\n${lines}\n\n⏰ Detected at: ${new Date().toLocaleString()}`
    };
 
    try {
      const res = await axios($, {
        method: "POST",
        url: webhookUrl,
        headers: {
          "Content-Type": "application/json"
        },
        data: message,
      });
 
      return { status: "sent", res };
    } catch (err) {
      console.error("❌ Discord webhook failed:", err);
      return { status: "error", error: err.message };
    }
  },
});

Replace the URL in this line with your own Discord Webhook:

const webhookUrl = "https://discord.com/api/webhooks/your_webhook_id/your_webhook_token";

If you don’t have a Webhook yet, you can create one in Discord as follows:

  1. Open the channel where you want to send notifications.
  2. Click Channel Settings > Integrations > Webhooks.
  3. Create a new Webhook and copy its URL to use as the webhookUrl mentioned above.

Preview (Message Sent)

When the workflow runs and the search returns knowledge panel results for the monitored keyword, you will receive a message like this in Discord:
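Based on the message built in Step 4, the notification will look roughly like this (the links and timestamp below are placeholders):

📡 **Keyword Monitor: "coffee"**

📌 https://example.com/first-result

📌 https://example.com/second-result

⏰ Detected at: <date and time of the run>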