Check the X-RateLimit-Remaining header to avoid hitting the limit:
Node.js
```javascript
const response = await fetch(url, { headers });
const remaining = Number(response.headers.get('X-RateLimit-Remaining'));
const reset = Number(response.headers.get('X-RateLimit-Reset'));

if (remaining < 5) {
  console.warn(`Only ${remaining} requests remaining`);
  // Slow down, or wait until `reset` before sending more requests
}
```
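The same check can be done in Python. This is a sketch: it assumes `X-RateLimit-Reset` is a Unix timestamp in seconds, which may differ for your API, and the `requests` usage is shown as a comment:

```python
import time

def seconds_until_reset(headers, threshold=5, now=None):
    """Return how long to pause based on rate-limit headers.

    Returns 0 if more than `threshold` requests remain; otherwise the
    seconds until X-RateLimit-Reset (assumed to be a Unix timestamp).
    """
    remaining = int(headers.get('X-RateLimit-Remaining', 0))
    reset = int(headers.get('X-RateLimit-Reset', 0))
    if remaining >= threshold:
        return 0
    now = time.time() if now is None else now
    return max(0, reset - now)

# Usage with e.g. requests:
# response = requests.get(url, headers=auth_headers)
# time.sleep(seconds_until_reset(response.headers))
```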
Batch operations when possible
Instead of firing off individual requests back-to-back, pace your operations:

❌ Bad: Back-to-back requests
```python
for product in products:
    create_product(product)  # 100 requests for 100 products
```
✅ Good: Spaced requests
```python
import time

for i, product in enumerate(products):
    create_product(product)
    # Space out requests to stay within the rate limit
    if (i + 1) % 15 == 0:  # 15 requests per batch
        time.sleep(60)     # Wait 1 minute
```
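The fixed batch-and-sleep pattern above can also be written as a small client-side rate limiter that enforces a minimum interval between calls. This is a sketch (the 15-per-minute budget matches the loop above; `create_product` is shown as a comment):

```python
import time

class RateLimiter:
    """Allow at most `max_calls` calls per `period` seconds by sleeping as needed."""

    def __init__(self, max_calls, period=60.0):
        self.min_interval = period / max_calls
        self.last_call = 0.0

    def wait(self):
        # Sleep just long enough to keep calls at least min_interval apart
        elapsed = time.monotonic() - self.last_call
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self.last_call = time.monotonic()

limiter = RateLimiter(max_calls=15)  # 15 req/min, as in the loop above
# for product in products:
#     limiter.wait()
#     create_product(product)
```

This spreads requests evenly instead of bursting 15 and then pausing a full minute, which keeps latency more predictable.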
Use appropriate endpoints
Choose the right endpoint for your use case:
- Use `GET /api/orders/<id>` (50 req/min) to fetch a specific order instead of polling
- Set up webhooks for real-time notifications instead of constant polling
- Use the DELETE endpoint (10 req/min) sparingly; it's intentionally limited
Use webhooks instead of polling
Instead of polling for new orders, use webhooks:

❌ Bad: Polling every minute
```python
import time

while True:
    check_for_new_orders()  # 1440 requests per day
    time.sleep(60)
```
✅ Good: Webhook notifications
```python
from flask import Flask, request

app = Flask(__name__)

@app.route('/webhook', methods=['POST'])
def handle_order():
    order = request.json
    process_order(order)  # 0 API requests needed
    return '', 200
```
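If your webhook endpoint is publicly reachable, it is worth verifying that requests really come from the API. Many providers sign the payload with an HMAC sent in a header; the header name (`X-Webhook-Signature`) and the shared secret below are assumptions, so check your provider's documentation:

```python
import hashlib
import hmac

def verify_signature(payload: bytes, signature: str, secret: str) -> bool:
    """Check an HMAC-SHA256 hex signature against the raw request body."""
    expected = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information via timing differences
    return hmac.compare_digest(expected, signature)

# In the Flask handler above (header name is an assumption):
# sig = request.headers.get('X-Webhook-Signature', '')
# if not verify_signature(request.get_data(), sig, WEBHOOK_SECRET):
#     return '', 401
```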
Putting the pacing pattern together, here is how to update 50 orders while staying under a 20 req/min limit:

```python
import time

orders = [...]  # 50 orders to update
batch_size = 18  # Stay under the 20 req/min limit

for i in range(0, len(orders), batch_size):
    batch = orders[i:i + batch_size]
    for order in batch:
        update_order_status(order['id'], 'dispatched', order['tracking_url'])
    if i + batch_size < len(orders):
        time.sleep(60)  # Wait for the next rate-limit window
```