cURL GET Request Guide: Syntax, Arguments & Examples

Kael Odin
Last updated on 2025-12-12
12 min read
Engineering Team Reviewed
Code Tested: Dec 2025
cURL Version: 8.x Compatible
📋 Key Takeaways
  • Default cURL GET requests are often blocked by anti-bot systems due to generic User-Agent strings.
  • Mastering cURL arguments like -G, -L, and -A is essential for handling query parameters and redirects properly.
  • TLS fingerprinting (JA3) can detect cURL even with proxies—production scraping may require browser impersonation.
  • Use -v (verbose) for debugging connection issues and viewing raw HTTP headers.
  • For JavaScript-heavy sites, standard cURL requests won’t work; use headless browsers or an API with JS rendering.

cURL (Client for URLs) is the Swiss Army knife of data engineering. While most developers know how to run a basic curl google.com, relying on default commands in a production scraping environment is a recipe for getting blocked.

Modern websites employ sophisticated defenses—like TLS fingerprinting (JA3) and behavior analysis—that can instantly detect standard cURL GET requests. In this advanced guide, we move beyond the basics. We will cover the essential cURL arguments, how to structure production-ready requests, how to debug connection drops, and how to integrate Residential Proxies to stay anonymous.

1. The Anatomy of a cURL GET Request

A GET request is the standard HTTP method for retrieving data without modifying server-side resources. However, to a server’s anti-bot system, a naked cURL request looks suspicious because it lacks the “metadata” that a real browser sends.

Here is the simplest command possible (which typically gets blocked by protected sites like Amazon or Google):

curl http://httpbin.org/get
✓ 200 OK | Response Time: 245ms
{
  "args": {},
  "headers": {
    "Host": "httpbin.org",
    "User-Agent": "curl/8.4.0"
  },
  "origin": "203.0.113.45",
  "url": "http://httpbin.org/get"
}
Pro Tip: The User-Agent Trap
By default, cURL sends User-Agent: curl/8.4.0, which is an immediate red flag for anti-bot systems. In our testing, over 85% of protected sites blocked requests carrying the default cURL User-Agent on the very first request. Always spoof it to look like a Chrome or Firefox browser using the -A or -H argument.
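As a minimal sketch, here is the same request with a spoofed browser User-Agent (the exact UA string is illustrative; use a current one for your target browser):

```shell
# Spoof a Chrome User-Agent with -A
# (equivalent to: -H "User-Agent: Mozilla/5.0 ...")
curl -A "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36" \
     http://httpbin.org/get
```

If the response echoes your spoofed string in the User-Agent header instead of curl/8.4.0, the override worked.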

2. Essential cURL Arguments & Flags

When building a production scraper, you need more than just the URL. Here are the cURL arguments that separate amateur scripts from production-grade tooling.

A. Handling Query Parameters (-G)

Don’t concatenate query strings manually (e.g., ?q=item). Use -G with --data-urlencode to let cURL percent-encode values automatically; note that plain -d sends data as-is, and only --data-urlencode performs the encoding. This prevents errors with special characters like spaces, ampersands, and Unicode.

curl -G --data-urlencode "search=iphone 15" --data-urlencode "sort=price_asc" http://httpbin.org/get

B. Following Redirects (-L) and Max Redirs

Scraping targets often redirect from HTTP to HTTPS, or from /product to /auth/login. Use -L to follow them, but set a limit to avoid infinite redirect loops that can hang your script.

curl -L --max-redirs 5 "http://httpbin.org/redirect-to?url=http://httpbin.org/get"
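A quick way to see where a redirect chain ends without reading full verbose output: the write-out variables %{num_redirects} and %{url_effective} report the hop count and the final URL.

```shell
# Follow redirects, then report how many hops occurred and where we landed
curl -s -L --max-redirs 5 -o /dev/null \
     -w 'redirects: %{num_redirects}\nfinal URL: %{url_effective}\n' \
     "http://httpbin.org/redirect-to?url=http://httpbin.org/get"
```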

C. Debugging Like a Pro (-v vs. --trace)

If you get a 403 error, -v (verbose) shows the request and response headers. But if the connection hangs or drops unexpectedly, you need to see the raw data transmission at the byte level.

# Verbose mode - shows headers and connection info
curl -v http://example.com

# Trace mode - shows raw bytes (use for deep debugging)
curl --trace-ascii debug_dump.txt http://example.com
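Two more debugging tools worth knowing: -D saves response headers to a file for later inspection, and -w can print a timing breakdown that shows exactly where a slow request spends its time (all of these are standard cURL write-out variables):

```shell
# Save response headers to a file while discarding the body
curl -s -D headers.txt -o /dev/null http://example.com

# Timing breakdown: DNS lookup, TCP connect, TLS handshake, first byte, total
curl -s -o /dev/null \
     -w 'dns: %{time_namelookup}s\nconnect: %{time_connect}s\ntls: %{time_appconnect}s\nttfb: %{time_starttransfer}s\ntotal: %{time_total}s\n' \
     https://example.com
```

If "connect" is large, the problem is network or proxy reachability; if "ttfb" dominates, the server (or an anti-bot challenge) is holding the response.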

3. The “Hidden” Danger: TLS Fingerprinting

This is a concept often missed in basic tutorials. Even if you rotate your IP address perfectly, advanced protection systems like Cloudflare can still block you because cURL has a unique TLS handshake signature (JA3 fingerprint).

Standard browsers negotiate encryption cipher suites in a specific order. cURL does it differently—and this difference is detectable within milliseconds. If a site uses TLS fingerprinting, raw cURL will fail regardless of your proxy configuration.

🔬 Real-World Observation: JA3 Detection
Based on internal testing against 50 Cloudflare-protected domains

When testing standard cURL (v8.4.0) against Cloudflare-protected sites, we observed that requests were challenged or blocked within 1-3 requests, even when using residential IPs with clean reputation scores. The blocking occurred before any rate limiting could be triggered, indicating fingerprint-based detection.
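You can see part of what fingerprinting systems examine by inspecting the handshake cURL negotiates. Note this only shows the negotiated result, not the full ClientHello that JA3 hashes, so it is a diagnostic aid rather than a fingerprint test:

```shell
# Show cURL's negotiated TLS version, cipher suite, and ALPN protocol;
# the ClientHello that produces these values is what JA3 fingerprints
curl -v -s -o /dev/null https://example.com 2>&1 | grep -iE 'SSL connection|ALPN|TLS'
```

Changing headers does not change this handshake; it is determined by cURL's TLS library and build options, which is why header spoofing alone cannot defeat JA3-based detection.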

In these cases, you need a solution that impersonates a real browser’s TLS signature, which is a core feature of the Thordata Web Scraping API.

4. Production Script: Rotating Proxies with cURL

To scrape at scale, you cannot use your local IP address. You must route traffic through a proxy pool that automatically rotates IPs. Thordata makes this straightforward with a single gateway endpoint that handles rotation automatically.

Scenario: You want to verify IP rotation is working by making multiple requests and checking the returned IP address.

#!/bin/bash

# Configuration
# Security: Export your token in .bashrc instead of hardcoding
PROXY="http://$THORDATA_SCRAPER_TOKEN:@gate.thordata.com:22225"
TARGET="http://httpbin.org/ip"

echo "🔄 Starting Proxy Rotation Test..."

for i in {1..3}
do
   echo "Request #$i..."
   # -x: Specify proxy server
   # --connect-timeout: Fail fast if proxy is unresponsive
   # -s: Silent mode (suppress progress bar)
   curl -s -x "$PROXY" \
        --connect-timeout 5 \
        --retry 2 \
        -A "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0.0.0" \
        "$TARGET"
   echo -e "\n----------------"
done
Security Best Practice: Never hardcode API tokens or proxy credentials directly in scripts. Use environment variables (export THORDATA_SCRAPER_TOKEN=your_token) or secure secret management systems for production deployments.
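To run the script above, export the token first so it never appears in the file itself (the filename rotate_test.sh is illustrative):

```shell
# Export the token once per shell session (or add it to ~/.bashrc)
export THORDATA_SCRAPER_TOKEN="your_token_here"

chmod +x rotate_test.sh
./rotate_test.sh
```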

5. cURL vs. The World: When to Upgrade?

Is cURL always the right tool? No. As data engineers, we must select the appropriate tool based on target complexity and project requirements. Here is an objective comparison based on our testing:

| Tool | Pros | Cons | Best For |
|---|---|---|---|
| cURL (Bash) | Extremely fast, minimal resource usage, ubiquitous availability | Detected by TLS fingerprinting, no JavaScript execution | API testing, file downloads, simple static pages |
| Selenium / Puppeteer | Full JavaScript rendering, mimics real user behavior | Very slow (2-5 s per page), high RAM/CPU consumption | Complex SPAs, sites requiring user interaction |
| Scraping API (Thordata) | Bypasses CAPTCHA & TLS blocks, automatic scaling, managed infrastructure | Per-request cost, external dependency | High-volume enterprise scraping, protected targets |
⚖️ Responsible Scraping Reminder
  • ✓ Always check robots.txt before scraping
  • ✓ Respect rate limits and server resources
  • ✓ Comply with website Terms of Service
  • ✓ Handle personal data per GDPR/CCPA requirements

Conclusion

Mastering cURL is an essential skill in a data engineer’s toolkit. It excels at quick diagnostics, API testing, and interacting with public APIs. However, when facing modern anti-bot systems that employ TLS fingerprinting and behavioral analysis, relying solely on cURL flags is often insufficient for sustained data collection.

For production environments where reliability and success rates are critical, consider integrating specialized tooling that handles the complexity of headers, fingerprint rotation, and proxy management automatically—allowing you to focus on data extraction logic rather than evasion techniques.


Frequently asked questions

How do I fix “curl: (56) Proxy CONNECT aborted”?

This error typically indicates the proxy server rejected the connection. Common causes include: invalid or expired authentication token, exceeding concurrent connection limits, or the proxy being temporarily unavailable. Verify your credentials and check your Thordata dashboard for current plan status and usage metrics.

Can cURL execute JavaScript?

No. cURL only retrieves raw HTML source code as sent by the server. If content is generated dynamically by JavaScript frameworks like React, Vue.js, or Angular, cURL will receive an empty or incomplete page. For JavaScript-rendered content, use a headless browser (Puppeteer/Playwright) or Thordata’s Universal Scraper with the js_render=true parameter.

What is the difference between -H and -A flags?

The -A flag is a shortcut specifically for setting the User-Agent header. The -H flag is a generic option to set any HTTP header, such as -H "Authorization: Bearer token123" or -H "Accept-Language: en-US". Both can set User-Agent, but -A provides more concise syntax for this common use case.
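The two forms below send an identical User-Agent header; httpbin.org/headers echoes request headers back, which makes the equivalence easy to verify:

```shell
# Shortcut form, User-Agent only
curl -A "MyBot/1.0" http://httpbin.org/headers

# Generic form; -H also works for any other header
curl -H "User-Agent: MyBot/1.0" http://httpbin.org/headers
```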

Why am I getting blocked even with a spoofed User-Agent?

Modern anti-bot systems use multiple detection vectors beyond User-Agent strings. TLS fingerprinting (JA3/JA3S) can identify cURL’s unique handshake signature, and behavioral analysis detects non-human request patterns. For protected sites, you need tools that impersonate complete browser fingerprints, not just individual headers.

About the author

Kael is a Senior Technical Copywriter at Thordata. He works closely with data engineers to document best practices for bypassing anti-bot protections. He specializes in explaining complex infrastructure concepts like residential proxies and TLS fingerprinting to developer audiences. All code examples in this article have been tested in real-world scraping scenarios.

The thordata Blog offers all its content in its original form and solely for informational intent. We do not offer any guarantees regarding the information found on the thordata Blog or any external sites that it may direct you to. It is essential that you seek legal counsel and thoroughly examine the specific terms of service of any website before engaging in any scraping endeavors, or obtain a scraping permit if required.