# Getting Started

## Request Workflow

1. **Submit a Task**: send a `POST` request to `/api/v1/scraper/request`.
2. **Handle the Response**:
   - HTTP 200 (Success): data is returned directly in the response body.
   - HTTP 201 (Processing): use the provided `taskId` to poll for results.
3. **Poll for Results**: for asynchronous tasks, repeatedly call `/api/v1/scraper/result/{taskId}` until data is ready (HTTP 200).
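The branch in step 2 can be sketched as a small helper; the function name and return shape here are illustrative, not part of the API:

```python
def next_action(status_code, body):
    """Map a submit response to the next workflow step.

    Returns ("done", data), ("poll", task_id), or ("error", body).
    """
    if status_code == 200:
        return ("done", body)            # data is in the response body
    if status_code == 201:
        return ("poll", body["taskId"])  # poll /result/{taskId}
    return ("error", body)

# Example: a 201 response tells us to start polling
print(next_action(201, {"taskId": "abc-123"}))  # ('poll', 'abc-123')
```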
## HTTP Status Codes

| Code | Meaning | Action |
|---|---|---|
| 200 | Success | Use the response body data directly. |
| 201 | Task in progress | Poll `/result/{taskId}` (recommended interval: 1-5 seconds). |
| 400 | Invalid parameters | Check that the task parameters are valid. |
| 429 | Rate limit exceeded | Reduce request frequency or contact support for a quota adjustment. |
| 500 | Internal server error | Retry after 1 minute; contact support if the error persists. |
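The retry guidance in the table can be condensed into one hypothetical helper that picks a wait time per status code; the exponential backoff for 429 is our own choice, not an API requirement:

```python
def retry_delay(status_code, attempt):
    """Seconds to wait before retrying, based on the status-code table above."""
    if status_code == 201:
        return 3                      # poll interval, within the 1-5 s range
    if status_code == 429:
        return min(2 ** attempt, 30)  # back off as rate-limit errors repeat
    if status_code == 500:
        return 60                     # retry after 1 minute
    return None                       # 200 or 400: do not retry

print(retry_delay(429, 4))  # 16
```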
## Code Examples

### Full Workflow

```python
import requests
import json
import time

API_KEY = "YOUR_API_KEY"
HOST = "api.scrapeless.com"

# Submit task
task_url = f"https://{HOST}/api/v1/scraper/request"
payload = json.dumps({
    "actor": "scraper.shopee",
    "input": {"url": "https://shopee.tw/a-i.10228173.24803858474"}
})
headers = {'Content-Type': 'application/json', 'x-api-token': API_KEY}

response = requests.post(task_url, headers=headers, data=payload)

# Handle response
if response.status_code == 200:
    print("Data:", response.json())
elif response.status_code == 201:
    task_id = response.json()["taskId"]
    print(f"Task queued. Polling ID: {task_id}")

    # Poll for results (max 10 attempts, 3 s interval)
    max_retries = 10
    for _ in range(max_retries):
        result_url = f"https://{HOST}/api/v1/scraper/result/{task_id}"
        result_response = requests.get(result_url, headers=headers)
        if result_response.status_code == 200:
            print("Result:", result_response.json())
            break
        elif result_response.status_code == 201:
            print("Still processing. Retrying in 3s...")
            time.sleep(3)
        else:
            print(f"Error {result_response.status_code}: {result_response.text}")
            break
else:
    print(f"Request failed: {response.status_code} - {response.text}")
```
### cURL (Polling Example)

```shell
curl --location --request GET 'https://api.scrapeless.com/api/v1/scraper/result/30681c8b-bfd3-48eb-a7c9-006e40b00591' \
--header 'x-api-token: YOUR_API_KEY' \
--header 'Content-Type: application/json'
```
## Parameters

| Parameter | Type | Description |
|---|---|---|
| `actor` | string | Scraping service (e.g., `scraper.shopee`). |
| `input` | object | Task-specific parameters (e.g., `action`, `url`). |
| `proxy` | object | Optional proxy configuration with a `country` field. |
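Putting the three parameters together, a request body might be assembled like this; the `proxy` value shown is an assumed example, not a documented default:

```python
import json

def build_payload(actor, input_params, proxy=None):
    """Assemble the JSON request body described in the Parameters table."""
    body = {"actor": actor, "input": input_params}
    if proxy is not None:
        body["proxy"] = proxy
    return json.dumps(body)

payload = build_payload(
    "scraper.shopee",
    {"url": "https://shopee.tw/a-i.10228173.24803858474"},
    proxy={"country": "TW"},  # assumed country code, for illustration only
)
print(payload)
```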
## Notes

- **Polling Recommendations**
  - Interval: 1-5 seconds.
  - Timeout: set a maximum retry limit (e.g., 10 attempts).
- **Debugging Tips**
  - Test with simple URLs first.
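Following the polling recommendations above, the loop can be factored into a reusable function. Here `fetch` is any callable returning `(status_code, body)`, which also makes the logic easy to exercise without network access; the function name is our own:

```python
import time

def poll_for_result(fetch, max_attempts=10, interval=3):
    """Call `fetch` until it reports HTTP 200; give up after max_attempts."""
    for attempt in range(max_attempts):
        status, body = fetch()
        if status == 200:
            return body                  # data is ready
        if status != 201:
            raise RuntimeError(f"Unexpected status {status}: {body}")
        if attempt < max_attempts - 1:
            time.sleep(interval)         # still processing; wait and retry
    return None                          # gave up: retry limit reached
```

In the full-workflow example, `fetch` would wrap `requests.get(result_url, headers=headers)` and return the status code and parsed body.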