
Spider works as a standard HTTP proxy. Set proxy.spider.cloud as your proxy address, put your API key in the username, and pass parameters in the password field. Every request goes through the same rendering and stealth pipeline as the REST API. No SDK needed.

What you get

  • Residential, ISP, and mobile proxy pools
  • 199+ countries with per-request geo-routing
  • Tracker and ad blocking on by default
  • Works with curl, requests, axios, or any HTTP client
  • ISP pool delivers up to 10 GB/s of throughput
  • Compatible with Crawl, Scrape, Screenshot, Search, Links
  • Automatic IP rotation and session management

Connection

HTTP

proxy.spider.cloud:80

HTTPS

proxy.spider.cloud:443

Username

YOUR-API-KEY

Password

PARAMETERS
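The credential mapping above can be sketched as a small URL builder. This is an illustrative helper, not part of Spider's API; the function name and defaults are assumptions for the sketch:

```python
def spider_proxy_url(api_key: str, params: str = "", scheme: str = "http", port: int = 80) -> str:
    # The API key rides in the username slot; request parameters ride
    # in the password slot, separated by the usual ':' in the URL.
    credentials = f"{api_key}:{params}" if params else api_key
    return f"{scheme}://{credentials}@proxy.spider.cloud:{port}"

print(spider_proxy_url("YOUR-API-KEY", "proxy=residential"))
# http://YOUR-API-KEY:proxy=residential@proxy.spider.cloud:80
```

The resulting string is what you hand to any HTTP client as its proxy address, so no Spider-specific tooling is involved.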

Proxy pools

Set the proxy parameter to pick a pool. Use country_code to geo-route.

Proxy Type   Price     Multiplier  Description
residential  $2.00/GB  ×2-×4       Real-user IPs in 199+ countries, up to 1 GB/s
isp          $1.00/GB  ×1          ISP-grade routing, highest throughput, up to 10 GB/s
mobile       $2.00/GB  ×2          4G/5G device IPs for heavily protected targets
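Pool selection and geo-routing combine in the password field. A minimal sketch follows; it assumes multiple parameters are joined query-string style with `&`, which is an assumption rather than something this page states:

```python
# Assumption: password-field parameters combine with '&', e.g. proxy=isp&country_code=de.
params = "proxy=isp&country_code=de"
proxies = {
    "http": f"http://YOUR-API-KEY:{params}@proxy.spider.cloud:80",
    "https": f"http://YOUR-API-KEY:{params}@proxy.spider.cloud:80",
}

# With a real key, pass this dict to any client, e.g.:
# requests.get("https://example.com", proxies=proxies)
print(proxies["http"])
```

Swapping `proxy=isp` for `proxy=residential` or `proxy=mobile` picks a different pool without touching the rest of the configuration.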

Tracker and ad blocking

Requests through the proxy automatically skip known analytics scripts, ad-network callbacks, and malicious domains. This is on by default. If you need the raw, unfiltered response (for example, ad verification or full-page archiving), turn it off per request.

trackers_disabled: Blocks trackers and ad networks. Defaults to true. Set to false to allow all traffic through.

Credentials example (username:password): YOUR-API-KEY:trackers_disabled=false
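For a per-request override, add the parameter to the password field. A short sketch, again assuming parameters combine with `&`:

```python
# Keep the residential pool but let trackers and ad networks through
# for this one request (useful for ad verification or full-page archiving).
password = "proxy=residential&trackers_disabled=false"
proxy_url = f"http://YOUR-API-KEY:{password}@proxy.spider.cloud:80"
print(proxy_url)
```

Requests made without this parameter keep the default blocking behavior.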

The proxy front-end works with Crawl, Scrape, Screenshot, Search, and Links. See all proxy locations.

Python example
import os

import requests

# Proxy configuration: API key as username, parameters as password.
# Ports match the connection section above (80 for HTTP, 443 for HTTPS).
API_KEY = os.getenv("SPIDER_API_KEY")
proxies = {
    "http": f"http://{API_KEY}:proxy=residential@proxy.spider.cloud:80",
    "https": f"https://{API_KEY}:proxy=residential@proxy.spider.cloud:443",
}

# Make a request through the proxy
def get_via_proxy(url):
    try:
        response = requests.get(url, proxies=proxies, timeout=30)
        response.raise_for_status()
        print("Response HTTP status code:", response.status_code)
        return response.text
    except requests.exceptions.RequestException as e:
        print(f"Error: {e}")
        return None

# Example usage
if __name__ == "__main__":
    get_via_proxy("https://www.example.com")
    get_via_proxy("https://www.example.com/community")