# Features

The Perplexity Scraper provides powerful customization options to help you generate more accurate, contextual, and verifiable responses from Perplexity. These features allow you to tailor how Perplexity behaves, what type of responses it returns, and how much detail it extracts.

***

### Geolocation

Control and influence Perplexity’s responses based on a target geographic region.\
Geolocation parameters affect examples, perspectives, terminology, and localized insights within the generated answer.

Geolocation supports **country-level** and **state / city-level** targeting.

***

#### Country

Influence Perplexity’s responses by specifying a target country.

This can affect examples, perspectives, or localized insights within the answer.

**Country format**\
The `country` value must be a **2-letter country code**.

**Example**

```json
{
  "prompt": "Trending products in 2025",
  "engine": "perplexity",
  "country": "ca"
}
```

**Use Cases**

* Localized content generation
* Region-specific trend insights
* Country-influenced context within Perplexity answers

***

#### State and City

Refine geolocation further by targeting a specific **state** or **state + city**.

The `state_city` parameter allows you to influence responses with more granular regional context, such as local brands, stores, events, or regulations.

> **Important**\
> The `country` parameter is **required** when using `state_city`.

**State / City format**

* State only
* State and city, separated by an underscore
* Values are lowercase with no spaces

**Examples**

**State only**

```json
{
  "prompt": "Trending products in 2025",
  "country": "us",
  "state_city": "california"
}
```

**State and city**

```json
{
  "prompt": "Trending products in 2025",
  "country": "us",
  "state_city": "california_losangeles"
}
```

**Use Cases**

* Hyper-local trend analysis
* City-specific product or business insights
* Location-aware content and recommendations

***

### **Callback URL (callback\_url)**

Send the LLM Scraper response directly to your own API endpoint.\
When `callback_url` is provided, the request is acknowledged immediately and the final LLM Scraper response is delivered to your endpoint via a POST request.

**Example**

```json
{
  "prompt": "Summarize the main benefits of residential proxies",
  "engine": "perplexity",
  "callback_url": "https://user-api.com"
}
```

**Initial Response (202 Accepted)**\
When `callback_url` is included, the API returns a confirmation response indicating that the request was received.

```json
{
  "message": "request received",
  "traceId": "123e4567-e89b-12d3-a456-426614174000",
  "response_url": "https://user-api.com"
}
```

**Final Response**\
Once processing is completed, the full LLM Scraper response will be sent to the provided `callback_url`.\
The response structure is identical to the standard LLM Scraper response returned in synchronous requests.

**Use Cases**

* Deliver results directly into backend workflows
* Separate request handling from response processing
* Trigger automation pipelines after completion
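A sketch of handling the asynchronous flow, assuming only the 202 acknowledgement shape documented above: the `traceId` from the acknowledgement is what lets you correlate the later POST to your `callback_url` with the original request.

```python
def parse_ack(ack: dict) -> str:
    """Return the traceId from the 202 acknowledgement so the final
    callback delivery can be matched to the original request."""
    if ack.get("message") != "request received":
        raise ValueError(f"unexpected acknowledgement: {ack!r}")
    return ack["traceId"]
```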

***

### **Error Handling**

The Perplexity Scraper returns standardized error responses.

#### **400 — Validation Errors**

Returned when:

* The prompt is missing
* A parameter is invalid
* The prompt exceeds 4096 characters

Example:

```json
{
  "message": "Prompt exceeds 4096 characters. Please shorten your prompt.",
  "error": "Bad Request",
  "statusCode": 400
}
```

***

#### **401 — Credentials Error**

Returned when:

* The username or password is incorrect
* There is no active package for the product

Example:

```json
{
    "message": "Request failed with status code 401"
}
```

***

#### **500 — Internal Errors**

Any scraping failure returns a 500 error code:

```json
{
  "message": "Perplexity scraper could not complete the request. Please try again.",
  "error": "Internal Scraper Error",
  "statusCode": 500
}
```
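The three error classes above suggest a simple client-side policy. The mapping below is our assumption, not part of the API: 400 means the request itself must be fixed, 401 points at credentials or the account package, and 500 is a transient scraper error that can be retried.

```python
def next_action(status_code: int) -> str:
    """Map a documented error status to a suggested client-side follow-up
    (illustrative policy, not prescribed by the API)."""
    if status_code == 400:
        return "fix-request"        # validation error: shorten or correct the prompt/parameters
    if status_code == 401:
        return "check-credentials"  # wrong username/password, or no active package
    if status_code == 500:
        return "retry"              # internal scraper error: safe to try again
    return "inspect"                # anything else: log and investigate
```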
