Query Operations
Execute queries across tables using full-text search, vector similarity search, or hybrid approaches combining both with Reciprocal Rank Fusion (RRF).
Query Types
Full-Text Search
Uses Bleve query syntax for powerful text searching:
- Field-specific queries: body:computer
- Boolean operators: AND, OR, NOT
- Range queries: year:>2020
- Phrase queries: "exact phrase"
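As a sketch, a full-text request body for the /query endpoint shown later can be assembled like this; the helper name is illustrative, not part of any client library:

```python
import json

def full_text_query(table, query, limit=10):
    """Build a full-text search request body (illustrative helper)."""
    return {"table": table, "full_text_search": {"query": query}, "limit": limit}

# Combine a field-specific term with a range clause using Bleve boolean syntax.
body = full_text_query("wikipedia", "body:computer AND year:>2020")
print(json.dumps(body))
```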
Semantic Search
Natural language queries using vector similarity:
- Searches across specified embedding indexes
- Returns semantically similar documents
- Supports multiple embedding models
Hybrid Search (RRF)
Combines full-text and semantic search using Reciprocal Rank Fusion:
{
"full_text_search": {"query": "body:computer"},
"semantic_search": "artificial intelligence",
"indexes": ["title_body_nomic"]
}
Filtering and Exclusion
Refine results with powerful filtering:
Filter Prefix
Only return documents whose keys start with a specific string:
{"filter_prefix": "user:"}
Returns only keys like "user:123", "user:456".
Filter Query
Apply a Bleve query as an AND condition:
{"filter_query": {"query": "+category:technology +year:>2020"}}
Documents must match both the main query and the filter query.
Exclusion Query
Exclude documents matching a Bleve query (NOT condition):
{"exclusion_query": {"query": "category:deprecated OR status:archived"}}
Performance Note: All filters are applied before scoring and ranking, improving query performance.
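Conceptually, filter_prefix is a key-prefix match applied before any scoring. The following is an intuition-building sketch of that semantics, not the server's implementation:

```python
def apply_filter_prefix(keys, prefix):
    """Illustrate filter_prefix semantics: keep only keys starting with prefix."""
    return [k for k in keys if k.startswith(prefix)]

keys = ["user:123", "user:456", "order:9", "session:abc"]
print(apply_filter_prefix(keys, "user:"))  # ['user:123', 'user:456']
```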
RAG (Retrieval-Augmented Generation)
Execute queries and generate summaries using LLMs:
- Supports multiple retrieval queries
- Streams results as Server-Sent Events (SSE)
- Returns markdown summaries with inline citations
- Combines results from multiple tables
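Because RAG results arrive as Server-Sent Events, clients need a small SSE parser. This sketch handles the generic SSE wire format (blank-line-separated events, "data:" payload lines, per the standard); the event payloads Antfly emits are not shown here:

```python
def parse_sse(stream_text):
    """Parse an SSE stream: events are separated by blank lines,
    and each 'data:' line carries one chunk of the payload."""
    events = []
    for block in stream_text.split("\n\n"):
        data_lines = [line[5:].lstrip() for line in block.split("\n") if line.startswith("data:")]
        if data_lines:
            events.append("\n".join(data_lines))
    return events

sample = "data: Hello\n\ndata: world\n\n"
print(parse_sse(sample))  # ['Hello', 'world']
```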
Retrieval Agent
Intelligent query routing with automatic query generation:
- Classifies queries as "question" or "search"
- Generates optimal search queries across tables
- Returns generated responses (for questions) or document IDs (for searches)
- Streams classification, keywords, queries, and results
Improving Search Relevance
Antfly provides several techniques to improve search result quality:
1. Hybrid Search (Combine Full-Text + Semantic)
Use both full_text_search and semantic_search together. Results are merged using
Reciprocal Rank Fusion (RRF) to balance keyword matching and semantic similarity.
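The RRF formula itself is simple: a document's fused score is the sum of 1/(k + rank) over each ranked list it appears in, where k is a smoothing constant (60 is a common default; Antfly exposes it as rank_constant in merge_config). A minimal sketch of the unweighted case:

```python
def rrf_merge(ranked_lists, k=60):
    """Fuse several ranked lists of doc IDs with Reciprocal Rank Fusion."""
    scores = {}
    for ranking in ranked_lists:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first.
    return sorted(scores, key=scores.get, reverse=True)

full_text = ["a", "b", "c"]
semantic = ["b", "a", "d"]
print(rrf_merge([full_text, semantic]))  # 'a' and 'b' tie and rank above 'c' and 'd'
```

Documents appearing near the top of both lists accumulate the most score, which is why RRF balances keyword and semantic evidence without needing the raw scores to be comparable.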
2. Reranking
Apply a cross-encoder model to re-score results based on query-document relevance. Rerankers are more accurate than embedding models but slower, so use them on already-filtered results (e.g., top 50-100).
{
"semantic_search": "gaming laptops",
"limit": 100,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "description"
}
}
3. Result Pruning
Filter out low-quality results by detecting score gaps or setting minimum thresholds:
{
"semantic_search": "machine learning",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30.0
}
}
- min_score_ratio: keep only results scoring at least this fraction of the top result's score
- max_score_gap_percent: stop when scores drop by more than this percentage between consecutive results
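For intuition, here is a rough re-implementation of those two rules over a descending score list; the server's exact behavior may differ:

```python
def prune(scores, min_score_ratio=0.5, max_score_gap_percent=30.0):
    """Keep results scoring >= ratio * top score, stopping at the first
    consecutive drop larger than max_score_gap_percent."""
    if not scores:
        return []
    top = scores[0]
    kept = [scores[0]]
    for prev, cur in zip(scores, scores[1:]):
        if cur < top * min_score_ratio:
            break  # below the minimum-ratio threshold
        if prev > 0 and (prev - cur) / prev * 100.0 > max_score_gap_percent:
            break  # score gap too large between consecutive results
        kept.append(cur)
    return kept

print(prune([1.0, 0.9, 0.85, 0.4, 0.39]))  # [1.0, 0.9, 0.85]
```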
4. Merge Strategies
Choose how full-text and semantic results are combined:
- rrf (default): Reciprocal Rank Fusion - balanced, works well in most cases
- rsf: Relative Score Fusion - normalizes and weights scores differently
- failover: use full-text results if embedding generation fails
5. Query Filtering
Use filter_query and filter_prefix to pre-filter before scoring, improving
both performance and relevance by removing irrelevant documents early.
Best Practice: Combine techniques for optimal results:
- Start with hybrid search (full-text + semantic)
- Apply filters to narrow the domain
- Use reranker on top results
- Apply pruner to remove outliers
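Following that recipe, a single request body combining all four techniques might look like this (table names and values are placeholders):

```python
query = {
    "table": "products",
    "full_text_search": {"query": "laptop gaming"},          # hybrid: keyword leg
    "semantic_search": "high performance gaming computers",  # hybrid: semantic leg
    "indexes": ["product_embedding"],
    "filter_query": {"query": "+in_stock:true"},             # narrow the domain first
    "limit": 100,                                            # wide net for the reranker
    "reranker": {"provider": "ollama",
                 "model": "dengcao/Qwen3-Reranker-0.6B:F16",
                 "field": "description"},                    # re-score the top results
    "pruner": {"min_score_ratio": 0.5,
               "max_score_gap_percent": 30.0},               # drop outliers
}
```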
Perform a global query
/query
Executes a query across all relevant tables and shards based on the query content.
Query Examples
Full-text search:
{
"table": "wikipedia",
"full_text_search": {"query": "body:computer"},
"limit": 10
}
Semantic search:
{
"table": "articles",
"semantic_search": "artificial intelligence applications",
"indexes": ["title_body_embedding"],
"limit": 20
}
Hybrid search (RRF):
{
"table": "products",
"full_text_search": {"query": "laptop gaming"},
"semantic_search": "high performance gaming computers",
"indexes": ["product_embedding"],
"filter_query": {"query": "+price:<2000 +in_stock:true"},
"fields": ["name", "price", "description"],
"limit": 15
}
With filtering:
{
"table": "users",
"filter_prefix": "tenant:acme:",
"full_text_search": {"query": "active:true"},
"exclusion_query": {"query": "status:deleted"},
"limit": 50
}
NDJSON format:
For bulk queries, send multiple queries as NDJSON with Content-Type: application/x-ndjson.
Each line must end with \n:
{"table":"wiki","semantic_search":"AI","indexes":["emb"],"limit":5}
{"table":"docs","full_text_search":{"query":"tutorial"},"limit":10}
Security
Provide your bearer token in the Authorization header when making requests to protected resources.
Example: Authorization: Bearer YOUR_API_KEY
Request Body
Example:
{
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
}
Code Examples
curl -X POST "/api/v1/query" \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
}'
const response = await fetch('/api/v1/query', {
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_API_KEY',
'Content-Type': 'application/json'
},
body: JSON.stringify({
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
})
});
const data = await response.json();
fetch('/api/v1/query', {
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_API_KEY',
'Content-Type': 'application/json'
},
body: JSON.stringify({
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
})
})
.then(response => response.json())
.then(data => console.log(data));
import requests
headers = {
'Authorization': 'Bearer YOUR_API_KEY'
}
response = requests.post(
'/api/v1/query',
headers=headers,
json={
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
}
)
data = response.json()
package main
import (
"bytes"
"net/http"
)
func main() {
body := []byte(`{
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
}`)
req, _ := http.NewRequest("POST", "/api/v1/query", bytes.NewBuffer(body))
req.Header.Set("Authorization", "Bearer YOUR_API_KEY")
req.Header.Set("Content-Type", "application/json")
client := &http.Client{}
resp, _ := client.Do(req)
defer resp.Body.Close()
}
Responses
{
"responses": [
{
"hits": {
"total": 0,
"hits": [
{
"_id": "string",
"_score": 0,
"_index_scores": {},
"_source": {}
}
],
"max_score": 0
},
"aggregations": {},
"analyses": {},
"graph_results": {},
"join_result": {
"strategy_used": "broadcast",
"left_rows_scanned": 0,
"right_rows_scanned": 0,
"rows_matched": 0,
"rows_unmatched_left": 0,
"rows_unmatched_right": 0,
"join_time_ms": 0
},
"took": 0,
"status": 0,
"error": "string",
"table": "string"
}
]
}
{
"error": "An error message"
}
{
"error": "An error message"
}
Standalone evaluation endpoint
/eval
Run evaluators on provided data without executing a query. Useful for testing evaluators, evaluating cached results, or batch evaluation.
Retrieval metrics (require ground_truth.relevant_ids and retrieved_ids):
- recall, precision, ndcg, mrr, map
LLM-as-judge metrics (require judge config):
- relevance, faithfulness, completeness, coherence, safety, helpfulness, correctness, citation_quality
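For intuition, the rank-free retrieval metrics reduce to simple set arithmetic over ground_truth.relevant_ids and retrieved_ids. This sketch uses the textbook definitions, which may not be byte-identical to the endpoint's output:

```python
def recall_precision(relevant_ids, retrieved_ids):
    """recall = |relevant ∩ retrieved| / |relevant|;
    precision = |relevant ∩ retrieved| / |retrieved|."""
    relevant, retrieved = set(relevant_ids), list(retrieved_ids)
    hits = sum(1 for doc_id in retrieved if doc_id in relevant)
    recall = hits / len(relevant) if relevant else 0.0
    precision = hits / len(retrieved) if retrieved else 0.0
    return recall, precision

print(recall_precision(["a", "b", "c"], ["a", "c", "x", "y"]))  # recall 2/3, precision 1/2
```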
Security
Provide your bearer token in the Authorization header when making requests to protected resources.
Example: Authorization: Bearer YOUR_API_KEY
Request Body
Example:
{
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
},
"query": "string",
"output": "string",
"context": [
{}
],
"retrieved_ids": [
"string"
]
}
Code Examples
curl -X POST "/api/v1/eval" \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
},
"query": "string",
"output": "string",
"context": [
{}
],
"retrieved_ids": [
"string"
]
}'
const response = await fetch('/api/v1/eval', {
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_API_KEY',
'Content-Type': 'application/json'
},
body: JSON.stringify({
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
},
"query": "string",
"output": "string",
"context": [
{}
],
"retrieved_ids": [
"string"
]
})
});
const data = await response.json();
fetch('/api/v1/eval', {
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_API_KEY',
'Content-Type': 'application/json'
},
body: JSON.stringify({
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
},
"query": "string",
"output": "string",
"context": [
{}
],
"retrieved_ids": [
"string"
]
})
})
.then(response => response.json())
.then(data => console.log(data));
import requests
headers = {
'Authorization': 'Bearer YOUR_API_KEY'
}
response = requests.post(
'/api/v1/eval',
headers=headers,
json={
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
},
"query": "string",
"output": "string",
"context": [
{}
],
"retrieved_ids": [
"string"
]
}
)
data = response.json()
package main
import (
"bytes"
"net/http"
)
func main() {
body := []byte(`{
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
},
"query": "string",
"output": "string",
"context": [
{}
],
"retrieved_ids": [
"string"
]
}`)
req, _ := http.NewRequest("POST", "/api/v1/eval", bytes.NewBuffer(body))
req.Header.Set("Authorization", "Bearer YOUR_API_KEY")
req.Header.Set("Content-Type", "application/json")
client := &http.Client{}
resp, _ := client.Do(req)
defer resp.Body.Close()
}
Responses
{
"scores": {
"retrieval": {},
"generation": {}
},
"summary": {
"average_score": 0,
"passed": 0,
"failed": 0,
"total": 0
},
"duration_ms": 0
}
{
"error": "An error message"
}
{
"error": "An error message"
}
[DEPRECATED] Answer Agent - use /agents/retrieval instead
/agents/answer
DEPRECATED: Use /agents/retrieval instead. This endpoint accepts the old AnswerAgentRequest format and internally delegates to the retrieval agent for backward compatibility.
Security
Provide your bearer token in the Authorization header when making requests to protected resources.
Example: Authorization: Bearer YOUR_API_KEY
Request Body
Example:
{
"query": "string",
"queries": [
{
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
}
],
"with_streaming": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"agent_knowledge": "string",
"max_context_tokens": 0,
"reserve_tokens": 0,
"steps": {
"classification": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"with_reasoning": true,
"force_strategy": "simple",
"force_semantic_mode": "rewrite",
"multi_phrase_count": 1
},
"answer": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"system_prompt": "string",
"generation_context": "Be concise and technical. Include code examples where relevant."
},
"followup": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"count": 1,
"context": "Focus on implementation details and edge cases"
},
"confidence": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"context": "Be conservative - only give high confidence if resources directly address the question"
}
},
"eval": {
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
}
},
"without_generation": true
}
Code Examples
curl -X POST "/api/v1/agents/answer" \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"query": "string",
"queries": [
{
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
}
],
"with_streaming": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"agent_knowledge": "string",
"max_context_tokens": 0,
"reserve_tokens": 0,
"steps": {
"classification": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"with_reasoning": true,
"force_strategy": "simple",
"force_semantic_mode": "rewrite",
"multi_phrase_count": 1
},
"answer": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"system_prompt": "string",
"generation_context": "Be concise and technical. Include code examples where relevant."
},
"followup": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"count": 1,
"context": "Focus on implementation details and edge cases"
},
"confidence": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"context": "Be conservative - only give high confidence if resources directly address the question"
}
},
"eval": {
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
}
},
"without_generation": true
}'
const response = await fetch('/api/v1/agents/answer', {
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_API_KEY',
'Content-Type': 'application/json'
},
body: JSON.stringify({
"query": "string",
"queries": [
{
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
}
],
"with_streaming": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"agent_knowledge": "string",
"max_context_tokens": 0,
"reserve_tokens": 0,
"steps": {
"classification": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"with_reasoning": true,
"force_strategy": "simple",
"force_semantic_mode": "rewrite",
"multi_phrase_count": 1
},
"answer": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"system_prompt": "string",
"generation_context": "Be concise and technical. Include code examples where relevant."
},
"followup": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"count": 1,
"context": "Focus on implementation details and edge cases"
},
"confidence": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"context": "Be conservative - only give high confidence if resources directly address the question"
}
},
"eval": {
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
}
},
"without_generation": true
})
});
const data = await response.json();

fetch('/api/v1/agents/answer', {
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_API_KEY',
'Content-Type': 'application/json'
},
body: JSON.stringify({
"query": "string",
"queries": [
{
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
}
],
"with_streaming": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"agent_knowledge": "string",
"max_context_tokens": 0,
"reserve_tokens": 0,
"steps": {
"classification": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"with_reasoning": true,
"force_strategy": "simple",
"force_semantic_mode": "rewrite",
"multi_phrase_count": 1
},
"answer": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"system_prompt": "string",
"generation_context": "Be concise and technical. Include code examples where relevant."
},
"followup": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"count": 1,
"context": "Focus on implementation details and edge cases"
},
"confidence": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"context": "Be conservative - only give high confidence if resources directly address the question"
}
},
"eval": {
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
}
},
"without_generation": true
})
})
.then(response => response.json())
.then(data => console.log(data));

import requests
headers = {
'Authorization': 'Bearer YOUR_API_KEY'
}
response = requests.post(
'/api/v1/agents/answer',
headers=headers,
json={
"query": "string",
"queries": [
{
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
}
],
"with_streaming": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"agent_knowledge": "string",
"max_context_tokens": 0,
"reserve_tokens": 0,
"steps": {
"classification": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"with_reasoning": true,
"force_strategy": "simple",
"force_semantic_mode": "rewrite",
"multi_phrase_count": 1
},
"answer": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"system_prompt": "string",
"generation_context": "Be concise and technical. Include code examples where relevant."
},
"followup": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"count": 1,
"context": "Focus on implementation details and edge cases"
},
"confidence": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"context": "Be conservative - only give high confidence if resources directly address the question"
}
},
"eval": {
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
}
},
"without_generation": true
}
)
data = response.json()

package main
import (
"bytes"
"net/http"
)
func main() {
body := []byte(`{
"query": "string",
"queries": [
{
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
}
],
"with_streaming": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"agent_knowledge": "string",
"max_context_tokens": 0,
"reserve_tokens": 0,
"steps": {
"classification": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"with_reasoning": true,
"force_strategy": "simple",
"force_semantic_mode": "rewrite",
"multi_phrase_count": 1
},
"answer": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"system_prompt": "string",
"generation_context": "Be concise and technical. Include code examples where relevant."
},
"followup": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"count": 1,
"context": "Focus on implementation details and edge cases"
},
"confidence": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"context": "Be conservative - only give high confidence if resources directly address the question"
}
},
"eval": {
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
}
},
"without_generation": true
}`)
req, _ := http.NewRequest("POST", "/api/v1/agents/answer", bytes.NewBuffer(body))
req.Header.Set("Authorization", "Bearer YOUR_API_KEY")
req.Header.Set("Content-Type", "application/json")
client := &http.Client{}
resp, err := client.Do(req)
if err != nil {
panic(err)
}
defer resp.Body.Close()
}

Responses
{
"answer": "string",
"answer_confidence": 0,
"context_relevance": 0,
"classification_transformation": {
"route_type": "question",
"strategy": "simple",
"semantic_mode": "rewrite",
"improved_query": "string",
"semantic_query": "string",
"step_back_query": "string",
"sub_questions": [
"string"
],
"multi_phrases": [
"string"
],
"reasoning": "string",
"confidence": 0
},
"query_results": [
{
"hits": {
"total": 0,
"hits": [
{
"_id": "string",
"_score": 0,
"_index_scores": {},
"_source": {}
}
],
"max_score": 0
},
"aggregations": {},
"analyses": {},
"graph_results": {},
"join_result": {
"strategy_used": "broadcast",
"left_rows_scanned": 0,
"right_rows_scanned": 0,
"rows_matched": 0,
"rows_unmatched_left": 0,
"rows_unmatched_right": 0,
"join_time_ms": 0
},
"took": 0,
"status": 0,
"error": "string",
"table": "string"
}
],
"followup_questions": [
"string"
],
"eval_result": {
"scores": {
"retrieval": {},
"generation": {}
},
"summary": {
"average_score": 0,
"passed": 0,
"failed": 0,
"total": 0
},
"duration_ms": 0
}
}

{
"error": "An error message"
}

{
"error": "An error message"
}

Build a search query from natural language
/agents/query-builder

Uses an LLM to translate natural language search intent into a structured Bleve query. The generated query can be used directly in the QueryRequest.full_text_search or filter_query fields.
This endpoint is useful for:
- Building queries from user descriptions
- Generating example queries for a table's schema
- Agentic retrieval in RAG pipelines
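As a sketch of that flow, a query-builder response can be dropped straight into a QueryRequest. The helper name and the choice of filter_query (rather than full_text_search) are our own; the response shape follows the Responses example further down:

```python
def build_query_request(built: dict, table: str, limit: int = 10) -> dict:
    """Wrap a /agents/query-builder response into a QueryRequest dict,
    reusing the generated structured query as an AND filter."""
    return {
        "table": table,
        "filter_query": built["query"],  # structured Bleve query from the builder
        "limit": limit,
    }

# Example builder response: a structured query plus metadata.
built = {
    "query": {"and": [{"match": "machine learning", "field": "content"},
                      {"term": "published", "field": "status"}]},
    "confidence": 0.85,
    "warnings": [],
}
print(build_query_request(built, "articles"))
```

Because filter_query is applied as an AND condition before scoring, the generated query narrows results without affecting ranking of the main query.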
Security
Provide your bearer token in the Authorization header when making requests to protected resources.
Example: Authorization: Bearer YOUR_API_KEY
Request Body
Example:
{
"table": "articles",
"intent": "Find all published articles about machine learning from the last year",
"schema_fields": [
"title",
"content",
"status",
"published_at"
],
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
}
}

Code Examples
curl -X POST "/api/v1/agents/query-builder" \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"table": "articles",
"intent": "Find all published articles about machine learning from the last year",
"schema_fields": [
"title",
"content",
"status",
"published_at"
],
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
}
}'

const response = await fetch('/api/v1/agents/query-builder', {
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_API_KEY',
'Content-Type': 'application/json'
},
body: JSON.stringify({
"table": "articles",
"intent": "Find all published articles about machine learning from the last year",
"schema_fields": [
"title",
"content",
"status",
"published_at"
],
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
}
})
});
const data = await response.json();

fetch('/api/v1/agents/query-builder', {
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_API_KEY',
'Content-Type': 'application/json'
},
body: JSON.stringify({
"table": "articles",
"intent": "Find all published articles about machine learning from the last year",
"schema_fields": [
"title",
"content",
"status",
"published_at"
],
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
}
})
})
.then(response => response.json())
.then(data => console.log(data));

import requests
headers = {
'Authorization': 'Bearer YOUR_API_KEY'
}
response = requests.post(
'/api/v1/agents/query-builder',
headers=headers,
json={
"table": "articles",
"intent": "Find all published articles about machine learning from the last year",
"schema_fields": [
"title",
"content",
"status",
"published_at"
],
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
}
}
)
data = response.json()

package main
import (
"bytes"
"net/http"
)
func main() {
body := []byte(`{
"table": "articles",
"intent": "Find all published articles about machine learning from the last year",
"schema_fields": [
"title",
"content",
"status",
"published_at"
],
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
}
}`)
req, _ := http.NewRequest("POST", "/api/v1/agents/query-builder", bytes.NewBuffer(body))
req.Header.Set("Authorization", "Bearer YOUR_API_KEY")
req.Header.Set("Content-Type", "application/json")
client := &http.Client{}
resp, err := client.Do(req)
if err != nil {
panic(err)
}
defer resp.Body.Close()
}

Responses
{
"query": {
"and": [
{
"match": "machine learning",
"field": "content"
},
{
"term": "published",
"field": "status"
}
]
},
"explanation": "Searches for 'machine learning' in content field AND requires status to be exactly 'published'",
"confidence": 0.85,
"warnings": [
"Field 'category' not found in schema, using content field instead"
]
}

{
"error": "An error message"
}

{
"error": "An error message"
}

{
"error": "An error message"
}

Retrieval Agent - Agentic document retrieval with tool calling
/agents/retrieval

Uses a DFA-based approach to retrieve documents: clarify → select_strategy → refine_query → execute
Key Features:
- Multi-strategy: Semantic, BM25, tree, graph, metadata, or hybrid
- Query Pipeline: Chain queries with references (e.g., tree search starting from semantic results)
- Clarification: Optional multi-turn for query disambiguation
- Reasoning Chain: Returns steps taken during retrieval
Strategies:
- semantic: Vector similarity search using embeddings
- bm25: Full-text search with BM25 scoring
- metadata: Structured field queries
- tree: Iterative tree navigation with summarization (PageIndex-style)
- graph: Relationship-based traversal
- hybrid: Combine strategies with RRF or rerank
SSE Event Types:
- dfa_state: DFA state transition
- tree_level: Tree search progress
- sufficiency_check: Whether collected documents are sufficient
- hit: Individual document result
- clarification_required: Need user input
- done: Retrieval complete
- error: Error occurred
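These event names can be dispatched on with a small client-side parser. This sketch assumes the standard Server-Sent Events wire format (`event:`/`data:` lines, blank-line separators) with JSON payloads, which is how the events above would typically arrive:

```python
import json

def parse_sse(raw: str):
    """Parse a Server-Sent Events stream into (event_name, payload) pairs.
    Event names follow the retrieval agent's list: dfa_state, tree_level,
    sufficiency_check, hit, clarification_required, done, error."""
    events = []
    name, data_lines = "message", []
    for line in raw.splitlines():
        if line.startswith("event:"):
            name = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
        elif line == "" and data_lines:
            # Blank line terminates the event; decode its JSON payload.
            events.append((name, json.loads("\n".join(data_lines))))
            name, data_lines = "message", []
    return events

stream = (
    "event: hit\n"
    'data: {"_id": "doc-1"}\n'
    "\n"
    "event: done\n"
    "data: {}\n"
    "\n"
)
for event, payload in parse_sse(stream):
    print(event, payload)
```

In a real client you would read the response body incrementally rather than buffering the whole stream, but the line-based framing is the same.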
Security
Provide your bearer token in the Authorization header when making requests to protected resources.
Example: Authorization: Bearer YOUR_API_KEY
Request Body
Example:
{
"query": "How do I configure OAuth?",
"queries": [
{
"table": "docs",
"semantic_search": "How do I configure OAuth?",
"indexes": [
"doc_embeddings"
],
"limit": 10
}
],
"messages": [
{
"role": "user",
"content": "string",
"tool_calls": [
{
"id": "string",
"name": "string",
"arguments": {}
}
],
"tool_results": [
{
"tool_call_id": "string",
"result": {},
"error": "string"
}
]
}
],
"context": [
{
"role": "user",
"content": "string",
"tool_calls": [
{
"id": "string",
"name": "string",
"arguments": {}
}
],
"tool_results": [
{
"tool_call_id": "string",
"result": {},
"error": "string"
}
]
}
],
"agent_knowledge": "This collection contains API documentation for the Acme product suite.",
"accumulated_filters": [
{
"field": "string",
"operator": "eq",
"value": null
}
],
"max_iterations": 0,
"max_context_tokens": 0,
"reserve_tokens": 4000,
"stream": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"steps": {
"tools": {
"enabled_tools": [
"add_filter",
"search",
"websearch"
],
"websearch_config": {
"provider": "google",
"max_results": 1,
"timeout_ms": 0,
"safe_search": true,
"language": "en",
"region": "us"
},
"fetch_config": {
"s3_credentials": {
"endpoint": "s3.amazonaws.com",
"use_ssl": true,
"access_key_id": "your-access-key-id",
"secret_access_key": "your-secret-access-key",
"session_token": "your-session-token"
},
"max_content_length": 0,
"allowed_hosts": [
"string"
],
"block_private_ips": true,
"max_download_size_bytes": 0,
"timeout_seconds": 0
},
"max_tool_iterations": 1
},
"classification": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"with_reasoning": true,
"force_strategy": "simple",
"force_semantic_mode": "rewrite",
"multi_phrase_count": 1
},
"generation": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"system_prompt": "string",
"generation_context": "Be concise and technical. Include code examples where relevant."
},
"followup": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"count": 1,
"context": "Focus on implementation details and edge cases"
},
"confidence": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"context": "Be conservative - only give high confidence if resources directly address the question"
},
"eval": {
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
}
}
},
"document_renderer": "{{encodeToon this.fields}}"
}

Code Examples
curl -X POST "/api/v1/agents/retrieval" \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"query": "How do I configure OAuth?",
"queries": [
{
"table": "docs",
"semantic_search": "How do I configure OAuth?",
"indexes": [
"doc_embeddings"
],
"limit": 10
}
],
"messages": [
{
"role": "user",
"content": "string",
"tool_calls": [
{
"id": "string",
"name": "string",
"arguments": {}
}
],
"tool_results": [
{
"tool_call_id": "string",
"result": {},
"error": "string"
}
]
}
],
"context": [
{
"role": "user",
"content": "string",
"tool_calls": [
{
"id": "string",
"name": "string",
"arguments": {}
}
],
"tool_results": [
{
"tool_call_id": "string",
"result": {},
"error": "string"
}
]
}
],
"agent_knowledge": "This collection contains API documentation for the Acme product suite.",
"accumulated_filters": [
{
"field": "string",
"operator": "eq",
"value": null
}
],
"max_iterations": 0,
"max_context_tokens": 0,
"reserve_tokens": 4000,
"stream": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"steps": {
"tools": {
"enabled_tools": [
"add_filter",
"search",
"websearch"
],
"websearch_config": {
"provider": "google",
"max_results": 1,
"timeout_ms": 0,
"safe_search": true,
"language": "en",
"region": "us"
},
"fetch_config": {
"s3_credentials": {
"endpoint": "s3.amazonaws.com",
"use_ssl": true,
"access_key_id": "your-access-key-id",
"secret_access_key": "your-secret-access-key",
"session_token": "your-session-token"
},
"max_content_length": 0,
"allowed_hosts": [
"string"
],
"block_private_ips": true,
"max_download_size_bytes": 0,
"timeout_seconds": 0
},
"max_tool_iterations": 1
},
"classification": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"with_reasoning": true,
"force_strategy": "simple",
"force_semantic_mode": "rewrite",
"multi_phrase_count": 1
},
"generation": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"system_prompt": "string",
"generation_context": "Be concise and technical. Include code examples where relevant."
},
"followup": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"count": 1,
"context": "Focus on implementation details and edge cases"
},
"confidence": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"context": "Be conservative - only give high confidence if resources directly address the question"
},
"eval": {
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
}
}
},
"document_renderer": "{{encodeToon this.fields}}"
}'

const response = await fetch('/api/v1/agents/retrieval', {
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_API_KEY',
'Content-Type': 'application/json'
},
body: JSON.stringify({
"query": "How do I configure OAuth?",
"queries": [
{
"table": "docs",
"semantic_search": "How do I configure OAuth?",
"indexes": [
"doc_embeddings"
],
"limit": 10
}
],
"messages": [
{
"role": "user",
"content": "string",
"tool_calls": [
{
"id": "string",
"name": "string",
"arguments": {}
}
],
"tool_results": [
{
"tool_call_id": "string",
"result": {},
"error": "string"
}
]
}
],
"context": [
{
"role": "user",
"content": "string",
"tool_calls": [
{
"id": "string",
"name": "string",
"arguments": {}
}
],
"tool_results": [
{
"tool_call_id": "string",
"result": {},
"error": "string"
}
]
}
],
"agent_knowledge": "This collection contains API documentation for the Acme product suite.",
"accumulated_filters": [
{
"field": "string",
"operator": "eq",
"value": null
}
],
"max_iterations": 0,
"max_context_tokens": 0,
"reserve_tokens": 4000,
"stream": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"steps": {
"tools": {
"enabled_tools": [
"add_filter",
"search",
"websearch"
],
"websearch_config": {
"provider": "google",
"max_results": 1,
"timeout_ms": 0,
"safe_search": true,
"language": "en",
"region": "us"
},
"fetch_config": {
"s3_credentials": {
"endpoint": "s3.amazonaws.com",
"use_ssl": true,
"access_key_id": "your-access-key-id",
"secret_access_key": "your-secret-access-key",
"session_token": "your-session-token"
},
"max_content_length": 0,
"allowed_hosts": [
"string"
],
"block_private_ips": true,
"max_download_size_bytes": 0,
"timeout_seconds": 0
},
"max_tool_iterations": 1
},
"classification": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"with_reasoning": true,
"force_strategy": "simple",
"force_semantic_mode": "rewrite",
"multi_phrase_count": 1
},
"generation": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"system_prompt": "string",
"generation_context": "Be concise and technical. Include code examples where relevant."
},
"followup": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"count": 1,
"context": "Focus on implementation details and edge cases"
},
"confidence": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"context": "Be conservative - only give high confidence if resources directly address the question"
},
"eval": {
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
}
}
},
"document_renderer": "{{encodeToon this.fields}}"
})
});
const data = await response.json();

fetch('/api/v1/agents/retrieval', {
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_API_KEY',
'Content-Type': 'application/json'
},
body: JSON.stringify({
"query": "How do I configure OAuth?",
"queries": [
{
"table": "docs",
"semantic_search": "How do I configure OAuth?",
"indexes": [
"doc_embeddings"
],
"limit": 10
}
],
"messages": [
{
"role": "user",
"content": "string",
"tool_calls": [
{
"id": "string",
"name": "string",
"arguments": {}
}
],
"tool_results": [
{
"tool_call_id": "string",
"result": {},
"error": "string"
}
]
}
],
"context": [
{
"role": "user",
"content": "string",
"tool_calls": [
{
"id": "string",
"name": "string",
"arguments": {}
}
],
"tool_results": [
{
"tool_call_id": "string",
"result": {},
"error": "string"
}
]
}
],
"agent_knowledge": "This collection contains API documentation for the Acme product suite.",
"accumulated_filters": [
{
"field": "string",
"operator": "eq",
"value": null
}
],
"max_iterations": 0,
"max_context_tokens": 0,
"reserve_tokens": 4000,
"stream": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"steps": {
"tools": {
"enabled_tools": [
"add_filter",
"search",
"websearch"
],
"websearch_config": {
"provider": "google",
"max_results": 1,
"timeout_ms": 0,
"safe_search": true,
"language": "en",
"region": "us"
},
"fetch_config": {
"s3_credentials": {
"endpoint": "s3.amazonaws.com",
"use_ssl": true,
"access_key_id": "your-access-key-id",
"secret_access_key": "your-secret-access-key",
"session_token": "your-session-token"
},
"max_content_length": 0,
"allowed_hosts": [
"string"
],
"block_private_ips": true,
"max_download_size_bytes": 0,
"timeout_seconds": 0
},
"max_tool_iterations": 1
},
"classification": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"with_reasoning": true,
"force_strategy": "simple",
"force_semantic_mode": "rewrite",
"multi_phrase_count": 1
},
"generation": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"system_prompt": "string",
"generation_context": "Be concise and technical. Include code examples where relevant."
},
"followup": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"count": 1,
"context": "Focus on implementation details and edge cases"
},
"confidence": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"context": "Be conservative - only give high confidence if resources directly address the question"
},
"eval": {
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
}
}
},
"document_renderer": "{{encodeToon this.fields}}"
})
})
.then(response => response.json())
.then(data => console.log(data));

import requests
headers = {
'Authorization': 'Bearer YOUR_API_KEY'
}
response = requests.post(
'/api/v1/agents/retrieval',
headers=headers,
json={
"query": "How do I configure OAuth?",
"queries": [
{
"table": "docs",
"semantic_search": "How do I configure OAuth?",
"indexes": [
"doc_embeddings"
],
"limit": 10
}
],
"messages": [
{
"role": "user",
"content": "string",
"tool_calls": [
{
"id": "string",
"name": "string",
"arguments": {}
}
],
"tool_results": [
{
"tool_call_id": "string",
"result": {},
"error": "string"
}
]
}
],
"context": [
{
"role": "user",
"content": "string",
"tool_calls": [
{
"id": "string",
"name": "string",
"arguments": {}
}
],
"tool_results": [
{
"tool_call_id": "string",
"result": {},
"error": "string"
}
]
}
],
"agent_knowledge": "This collection contains API documentation for the Acme product suite.",
"accumulated_filters": [
{
"field": "string",
"operator": "eq",
"value": null
}
],
"max_iterations": 0,
"max_context_tokens": 0,
"reserve_tokens": 4000,
"stream": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"steps": {
"tools": {
"enabled_tools": [
"add_filter",
"search",
"websearch"
],
"websearch_config": {
"provider": "google",
"max_results": 1,
"timeout_ms": 0,
"safe_search": true,
"language": "en",
"region": "us"
},
"fetch_config": {
"s3_credentials": {
"endpoint": "s3.amazonaws.com",
"use_ssl": true,
"access_key_id": "your-access-key-id",
"secret_access_key": "your-secret-access-key",
"session_token": "your-session-token"
},
"max_content_length": 0,
"allowed_hosts": [
"string"
],
"block_private_ips": true,
"max_download_size_bytes": 0,
"timeout_seconds": 0
},
"max_tool_iterations": 1
},
"classification": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"with_reasoning": true,
"force_strategy": "simple",
"force_semantic_mode": "rewrite",
"multi_phrase_count": 1
},
"generation": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"system_prompt": "string",
"generation_context": "Be concise and technical. Include code examples where relevant."
},
"followup": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"count": 1,
"context": "Focus on implementation details and edge cases"
},
"confidence": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"context": "Be conservative - only give high confidence if resources directly address the question"
},
"eval": {
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
}
}
},
"document_renderer": "{{encodeToon this.fields}}"
}
)
data = response.json()

package main
import (
"bytes"
"net/http"
)
func main() {
body := []byte(`{
"query": "How do I configure OAuth?",
"queries": [
{
"table": "docs",
"semantic_search": "How do I configure OAuth?",
"indexes": [
"doc_embeddings"
],
"limit": 10
}
],
"messages": [
{
"role": "user",
"content": "string",
"tool_calls": [
{
"id": "string",
"name": "string",
"arguments": {}
}
],
"tool_results": [
{
"tool_call_id": "string",
"result": {},
"error": "string"
}
]
}
],
"context": [
{
"role": "user",
"content": "string",
"tool_calls": [
{
"id": "string",
"name": "string",
"arguments": {}
}
],
"tool_results": [
{
"tool_call_id": "string",
"result": {},
"error": "string"
}
]
}
],
"agent_knowledge": "This collection contains API documentation for the Acme product suite.",
"accumulated_filters": [
{
"field": "string",
"operator": "eq",
"value": null
}
],
"max_iterations": 0,
"max_context_tokens": 0,
"reserve_tokens": 4000,
"stream": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"steps": {
"tools": {
"enabled_tools": [
"add_filter",
"search",
"websearch"
],
"websearch_config": {
"provider": "google",
"max_results": 1,
"timeout_ms": 0,
"safe_search": true,
"language": "en",
"region": "us"
},
"fetch_config": {
"s3_credentials": {
"endpoint": "s3.amazonaws.com",
"use_ssl": true,
"access_key_id": "your-access-key-id",
"secret_access_key": "your-secret-access-key",
"session_token": "your-session-token"
},
"max_content_length": 0,
"allowed_hosts": [
"string"
],
"block_private_ips": true,
"max_download_size_bytes": 0,
"timeout_seconds": 0
},
"max_tool_iterations": 1
},
"classification": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"with_reasoning": true,
"force_strategy": "simple",
"force_semantic_mode": "rewrite",
"multi_phrase_count": 1
},
"generation": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"system_prompt": "string",
"generation_context": "Be concise and technical. Include code examples where relevant."
},
"followup": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"count": 1,
"context": "Focus on implementation details and edge cases"
},
"confidence": {
"enabled": true,
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"chain": [
{
"generator": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"retry": {
"max_attempts": 1,
"initial_backoff_ms": 100,
"backoff_multiplier": 1,
"max_backoff_ms": 0
},
"condition": "always"
}
],
"context": "Be conservative - only give high confidence if resources directly address the question"
},
"eval": {
"evaluators": [
"recall"
],
"judge": {
"provider": "openai",
"model": "gpt-4.1",
"temperature": 0.7,
"max_tokens": 2048
},
"ground_truth": {
"relevant_ids": [
"string"
],
"expectations": "string"
},
"options": {
"k": 1,
"pass_threshold": 0,
"timeout_seconds": 1
}
}
},
"document_renderer": "{{encodeToon this.fields}}"
}`)
req, _ := http.NewRequest("POST", "/api/v1/agents/retrieval", bytes.NewBuffer(body))
req.Header.Set("Authorization", "Bearer YOUR_API_KEY")
req.Header.Set("Content-Type", "application/json")
client := &http.Client{}
resp, _ := client.Do(req)
defer resp.Body.Close()
}

Responses
{
"hits": [
{
"_id": "string",
"_score": 0,
"_index_scores": {},
"_source": {}
}
],
"reasoning_chain": [
{
"step": "semantic_search",
"action": "Searched for OAuth configuration in doc_embeddings index",
"details": {}
}
],
"strategy_used": "semantic",
"state": "tool_calling",
"clarification_request": {
"question": "Did you mean OAuth 1.0 or OAuth 2.0?",
"options": [
"OAuth 1.0",
"OAuth 2.0",
"Both"
],
"reason": "The query mentions OAuth but doesn't specify which version"
},
"applied_filters": [
{
"field": "string",
"operator": "eq",
"value": null
}
],
"tool_calls_made": 0,
"messages": [
{
"role": "user",
"content": "string",
"tool_calls": [
{
"id": "string",
"name": "string",
"arguments": {}
}
],
"tool_results": [
{
"tool_call_id": "string",
"result": {},
"error": "string"
}
]
}
],
"classification": {
"route_type": "question",
"strategy": "simple",
"semantic_mode": "rewrite",
"improved_query": "string",
"semantic_query": "string",
"step_back_query": "string",
"sub_questions": [
"string"
],
"multi_phrases": [
"string"
],
"reasoning": "string",
"confidence": 0
},
"generation": "string",
"generation_confidence": 0,
"context_relevance": 0,
"followup_questions": [
"string"
],
"eval_result": {
"scores": {
"retrieval": {},
"generation": {}
},
"summary": {
"average_score": 0,
"passed": 0,
"failed": 0,
"total": 0
},
"duration_ms": 0
}
}

{
"error": "An error message"
}

{
"error": "An error message"
}

Query a specific table
/tables/{tableName}/query

Security
Provide your bearer token in the Authorization header when making requests to protected resources.
Example: Authorization: Bearer YOUR_API_KEY
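Building on the header format above, a minimal sketch of assembling the request headers in Python (the token value is a placeholder, not a real key):

```python
# Build the headers sent with every protected request.
API_KEY = "YOUR_API_KEY"
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}
print(headers["Authorization"])  # Bearer YOUR_API_KEY
```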
Request Body
Example:
{
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
}

Code Examples
curl -X POST "/api/v1/tables/{tableName}/query" \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
}'

const response = await fetch('/api/v1/tables/{tableName}/query', {
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_API_KEY',
'Content-Type': 'application/json'
},
body: JSON.stringify({
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
})
});
const data = await response.json();

fetch('/api/v1/tables/{tableName}/query', {
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_API_KEY',
'Content-Type': 'application/json'
},
body: JSON.stringify({
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
})
})
.then(response => response.json())
.then(data => console.log(data));

import requests
headers = {
'Authorization': 'Bearer YOUR_API_KEY'
}
response = requests.post(
'/api/v1/tables/{tableName}/query',
headers=headers,
json={
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
}
)
data = response.json()

package main
import (
"bytes"
"net/http"
)
func main() {
body := []byte(`{
"table": "wikipedia",
"full_text_search": {
"query": "+body:computer +category:technology",
"boost": 1
},
"semantic_search": "artificial intelligence and machine learning applications",
"embedding_template": "{{remotePDF url=this}}",
"indexes": [
"title_body_nomic",
"description_embedding"
],
"filter_prefix": "string",
"filter_query": {
"query": "+category:technology +year:>2020",
"boost": 1
},
"exclusion_query": {
"query": "category:deprecated OR status:archived",
"boost": 1
},
"aggregations": {},
"embeddings": {},
"fields": [
"title",
"url",
"summary",
"created_at"
],
"limit": 20,
"offset": 0,
"order_by": {
"created_at": true,
"score": true
},
"distance_under": 0.5,
"distance_over": 0.1,
"merge_config": {
"strategy": "rrf",
"weights": {
"full_text": 0.3,
"title_embedding": 1
},
"window_size": 1,
"rank_constant": 0
},
"count": false,
"reranker": {
"provider": "ollama",
"model": "dengcao/Qwen3-Reranker-0.6B:F16",
"field": "content"
},
"analyses": {
"pca": true,
"tsne": true
},
"graph_searches": {},
"expand_strategy": "union",
"document_renderer": "{{encodeToon this.fields}}",
"pruner": {
"min_score_ratio": 0.5,
"max_score_gap_percent": 30,
"min_absolute_score": 0.01,
"require_multi_index": true,
"std_dev_threshold": 1.5
},
"join": {
"right_table": "customers",
"join_type": "inner",
"on": {
"left_field": "customer_id",
"right_field": "id",
"operator": "eq"
},
"right_filters": {
"filter_query": {
"term": "string",
"field": "string",
"boost": 0
},
"filter_prefix": "string",
"limit": 0
},
"right_fields": [
"name",
"email",
"tier"
],
"strategy_hint": "broadcast",
"nested_join": "..."
},
"foreign_sources": {}
}`)
req, _ := http.NewRequest("POST", "/api/v1/tables/{tableName}/query", bytes.NewBuffer(body))
req.Header.Set("Authorization", "Bearer YOUR_API_KEY")
req.Header.Set("Content-Type", "application/json")
client := &http.Client{}
resp, _ := client.Do(req)
defer resp.Body.Close()
}

Responses
{
"responses": [
{
"hits": {
"total": 0,
"hits": [
{
"_id": "string",
"_score": 0,
"_index_scores": {},
"_source": {}
}
],
"max_score": 0
},
"aggregations": {},
"analyses": {},
"graph_results": {},
"join_result": {
"strategy_used": "broadcast",
"left_rows_scanned": 0,
"right_rows_scanned": 0,
"rows_matched": 0,
"rows_unmatched_left": 0,
"rows_unmatched_right": 0,
"join_time_ms": 0
},
"took": 0,
"status": 0,
"error": "string",
"table": "string"
}
]
}
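The response schema above nests hits inside per-table `responses` entries. A minimal sketch of flattening them into best-first `(_id, _score)` pairs, assuming the field names shown (`responses`, `hits`, `_id`, `_score`); adjust if your deployment differs:

```python
# Flatten a query response (shape shown above) into (_id, _score) pairs,
# sorted highest score first.
def top_hits(response: dict, n: int = 5) -> list:
    pairs = []
    for table_response in response.get("responses", []):
        for hit in table_response.get("hits", {}).get("hits", []):
            pairs.append((hit["_id"], hit["_score"]))
    pairs.sort(key=lambda p: p[1], reverse=True)
    return pairs[:n]

# Hypothetical response payload following the schema above.
sample = {
    "responses": [
        {"hits": {"total": 2, "hits": [
            {"_id": "doc:2", "_score": 0.41, "_source": {}},
            {"_id": "doc:1", "_score": 0.92, "_source": {}},
        ], "max_score": 0.92}, "table": "wikipedia"}
    ]
}
print(top_hits(sample))  # [('doc:1', 0.92), ('doc:2', 0.41)]
```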