Getting Started with Forecasts API

Overview

The Forecasts API delivers fast, accurate, and scalable demand forecasts—powered by the real-world events that impact your business. Whether you’re starting from scratch or augmenting an existing model, our event-driven forecasting approach improves accuracy, unlocking significant ROI and cutting development time by months.

This API provides ready-to-use, event-optimized forecasts for your business, embedding the impact of sports, concerts, school holidays, and more directly into the forecast output. There’s no need to source or model event effects separately—we handle it for you.

Why Use It?

  • Event-aware by default — real-world events are built into every forecast

  • Industry-specific performance — designed for demand planners, revenue managers, and ops teams

  • Faster and more affordable than building your own system

PredictHQ’s Forecasts API is the only event-driven, fully automated forecasting solution available—built to get you to accurate forecasts without the complexity.

SageMaker Demo

The Forecasts API can be used anywhere you can run code (SageMaker, Snowflake, Databricks, etc.). The demo in this guide runs in AWS SageMaker. Use our Forecasts API Notebook Run-Through in AWS SageMaker to run the example yourself and adapt it to your needs.

Requirements

Before you get started, make sure you have an API Token. All code snippets in this guide assume the following configuration has already been set:

import os
import json

import pandas as pd
import requests

PHQ_API_TOKEN = os.getenv("PHQ_API_TOKEN") or "REPLACE_WITH_YOUR_ACCESS_TOKEN"
API_URL = "https://api.predicthq.com"

headers = {
    "Authorization": f"Bearer {PHQ_API_TOKEN}",
    "Content-Type": "application/json",
}

lat = 51.50396
lon = 0.00476
industry = "restaurants"
name = "Sample Restaurant Location"

Forecasting Workflow

Prepare Your Data

To generate a forecast, you need to provide a daily time series with two columns:

Column | Description
date | The date of the observation, in YYYY-MM-DD format (ISO 8601).
demand | The actual demand value for that date (e.g. units sold, bookings).

Requirements:

  • The data must be at a daily level

  • Provide at least 18 months of history for best results

  • Demand data will be rejected if it contains duplicated dates, missing values in the demand column, or non-numeric demand values (see the validation sketch after the example below)

Example:

date,demand
2023-02-03,17696
2023-02-04,28718
2023-02-05,24442
2023-02-06,13468
2023-02-07,12600
2023-02-08,13671
2023-02-09,13324
2023-02-10,16589
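
As a quick sanity check before uploading, you can validate a demand file against these rules locally. The sketch below is a minimal example using pandas; the file path and column names match the sample above, and the checks simply mirror the rejection rules listed here rather than any official validator.

import pandas as pd

demand_df = pd.read_csv("data/sample_demand.csv", parse_dates=["date"])

# Dates must be unique (no duplicated days)
assert demand_df["date"].is_unique, "Duplicated dates found"

# Demand values must be present and numeric
assert demand_df["demand"].notna().all(), "Missing demand values found"
assert pd.api.types.is_numeric_dtype(demand_df["demand"]), "Non-numeric demand values found"

# Data should be daily and ideally cover at least ~18 months of history
demand_df = demand_df.sort_values("date")
print("Largest gap between observations:", demand_df["date"].diff().max())
print("History length (days):", (demand_df["date"].max() - demand_df["date"].min()).days)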

Create a Model

All forecast models are tied to a Saved Location, so you can define the location once and create multiple models for it. For this example, we'll use a hypothetical restaurant located near the O2 Arena in London.

Create Saved Location (Using Suggested Radius)

Our Suggested Radius API calculates the optimal area around your business to capture the events most likely to impact your demand.

# Get suggested radius
response = requests.get(
    url=f"{API_URL}/v1/suggested-radius/",
    headers=headers,
    params={
        "location.origin": f"{lat},{lon}",
        "industry": industry,
        "radius_unit": "mi",
    },
)

data = response.json()
radius = data["radius"]
radius_unit = data["radius_unit"]

print(f"Suggested radius: {radius} {radius_unit}")

# Suggested radius: 1.11 mi
# Create Saved Location
response = requests.post(
    url=f"{API_URL}/v1/saved-locations",
    headers=headers,
    data=json.dumps(
        {
            "name": name,
            "geojson": {
                "type": "Feature",
                "properties": {"radius": radius, "radius_unit": radius_unit},
                "geometry": {
                    "type": "Point",
                    "coordinates": [lon, lat],  # GeoJSON order is lon,lat
                },
            },
        }
    ),
)

location_id = response.json()["location_id"]
print(f"Saved location ID: {location_id}")

# Saved location ID: -ErnOilZkeP6P6CPdcXvTg

After creating the Saved Location, we can re-use it across as many forecast models as we need.

Create a Model

# Define model
response = requests.post(
    url=f"{API_URL}/v1/forecasts/models",
    headers=headers,
    json={
        "name": f"{name} Forecast",
        "location": {"saved_location_id": location_id},
        "algo": "phq-xgboost",
        "forecast_window": "7d",
        "demand_type": {
            "industry": industry,
        },
    },
)

model_id = response.json()["model_id"]
print(f"Model ID: {model_id}")

# Model ID: Oa1D2XvT-IXfFQ_osoTZjQ

Upload Demand Data

# Upload demand
sample_demand_df = pd.read_csv("data/sample_demand.csv")
sample_demand_json = sample_demand_df.to_json(orient="records")

response = requests.post(
    url=f"{API_URL}/v1/forecasts/models/{model_id}/demand",
    headers=headers,
    json={"demand": json.loads(sample_demand_json)},
)

print(f"Demand upload: {'Successful' if response.status_code == 201 else 'Failed'}")

# Demand upload: Successful

Train the Model

During the training process, your demand data is analyzed by Beam to determine which types of events impact your demand. This includes correlation and feature importance testing. The important features (from the Features API) are then used when training your model and when forecasting.

# Train model
response = requests.post(
    url=f"{API_URL}/v1/forecasts/models/{model_id}/train",
    headers=headers,
)

print(f"Model training: {'Successful' if response.status_code == 204 else 'Failed'}")

# Model training: Successful

Training usually takes a few minutes.
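
Training is asynchronous, so you may want to wait for it to finish before requesting a forecast. One simple approach, sketched below, is to poll the Get Model endpoint (used in the next step) until evaluation metrics appear in the response. Treat this as an assumption rather than the official readiness check; consult the API reference for the authoritative status fields.

import time

# Poll the model until training has produced evaluation metrics
# (assumption: metrics appear once training is complete)
for _ in range(30):
    response = requests.get(
        url=f"{API_URL}/v1/forecasts/models/{model_id}",
        headers=headers,
    )
    model = response.json()["model"]
    if model.get("metrics"):
        print("Model is trained and ready")
        break
    time.sleep(10)
else:
    print("Model not ready yet - check again later")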

Evaluate Forecast Model

Use evaluation metrics such as MAPE to compare model performance against other models and benchmarks. Lower values indicate better accuracy; see the Understanding Forecast Accuracy Metrics guide for help interpreting MAPE, MAE, and RMSE. In this example, the model achieved a MAPE of 8.96%.

# Get evaluation results
response = requests.get(
    url=f"{API_URL}/v1/forecasts/models/{model_id}",
    headers=headers,
)

print(f"Evaluation metrics: {response.json()['model']['metrics']}")

"""
Evaluation metrics:
{
    'accuracy': {
        'mape': 8.96,
        'mae': 1708.8,
        'rmse': 2259.08
    },
    'demand_data': {
        'date_range': {
            'start': '2023-02-03',
            'end': '2023-08-02'
        }
    },
    'training_data': {
        'date_range': {
            'start': '2023-02-03',
            'end': '2023-08-02'
        },
        'missing_pct': 0.0,
        'missing_dates': []
    }
}
"""

Retrieve Forecast

# Get forecast
response = requests.get(
    url=f"{API_URL}/v1/forecasts/models/{model_id}/forecast",
    headers=headers,
    params={
        "date.gte": "2023-08-03",
        "include": "phq_explainability"
    },
)

results = response.json()["results"]
forecasts_df = pd.DataFrame(results)

Visualize the actual demand we uploaded alongside the forecasted demand we just retrieved (the result is a time series chart showing the actual and forecasted demand):
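
Here is a minimal plotting sketch, assuming matplotlib is available. It uses the sample_demand_df and forecasts_df DataFrames from the earlier steps, which contain date/demand and date/forecast columns respectively.

import matplotlib.pyplot as plt

# Plot actual demand and the retrieved forecast on one chart
actual = sample_demand_df.copy()
actual["date"] = pd.to_datetime(actual["date"])

forecast = forecasts_df.copy()
forecast["date"] = pd.to_datetime(forecast["date"])

plt.figure(figsize=(12, 4))
plt.plot(actual["date"], actual["demand"], label="Actual demand")
plt.plot(forecast["date"], forecast["forecast"], label="Forecast")
plt.legend()
plt.title("Actual vs forecasted demand")
plt.show()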

Ongoing Forecasting

After you have trained a model, you can keep using it in your ongoing workflow: upload new actuals as they come in and retrieve fresh forecasts from the same model, as sketched below.
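
This sketch simply re-uses the Upload Demand Data and Get Forecast calls from earlier with the same model_id. The data/new_demand.csv path and the forecast start date are placeholders for your own data.

# Upload the latest actual demand for the existing model
new_demand_df = pd.read_csv("data/new_demand.csv")  # hypothetical file with date,demand columns
response = requests.post(
    url=f"{API_URL}/v1/forecasts/models/{model_id}/demand",
    headers=headers,
    json={"demand": json.loads(new_demand_df.to_json(orient="records"))},
)

# Retrieve an updated forecast starting from the day after your latest actuals
response = requests.get(
    url=f"{API_URL}/v1/forecasts/models/{model_id}/forecast",
    headers=headers,
    params={"date.gte": "2023-08-10"},  # placeholder date
)
latest_forecast_df = pd.DataFrame(response.json()["results"])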

Explainability

Every date in the forecast response includes a forecast value—that’s the core output you’ll use. Optionally, you can request explainability to get additional context on why the model predicted that value for a given day. This includes a list of impactful real-world events (e.g. school holidays, concerts) that the model considered significant for that date. There are two key pieces of explainability that can be provided:

  • phq_explainability - Top events the model has determined are impacting your demand on this date.

  • phq_features - List of features (from the Features API) that were identified through Beam's Feature Importance process as relevant to your demand, as well as their values. This field is only available to customers who have also purchased our Features product.

Explainability is optional—use phq_explainability in your include query param to enable it.

Here's an example truncated response for a single date showing phq_explainability:

{
  "date": "2023-08-03",
  "forecast": 18001.67,
  "phq_explainability": {
    "events": [
      {
        "id": "GqQA6oSLn8CGBao3vM",
        "category": "school-holidays",
        "title": "Newham - Summer Holidays",
        "start_local": "2023-07-22T00:00:00",
        "end_local": "2023-08-31T23:59:59",
        "phq_rank": 86,
        "local_rank": 69
      },
      {
        "id": "AZTohndL3PcdjwGjje",
        "category": "performing-arts",
        "title": "Mamma Mia! the Party",
        "start_local": "2023-08-03T18:30:00",
        "end_local": "2023-08-03T18:30:00",
        "phq_rank": 60,
        "local_rank": 69
      },
      ...
    ]
  }
}
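
If you want to work with explainability programmatically, you can flatten the per-date event lists into a table. The sketch below is illustrative only; it iterates over the results list retrieved earlier and assumes phq_explainability was included in the request.

# Flatten explainability events into one row per (date, event)
rows = []
for day in results:
    for event in day.get("phq_explainability", {}).get("events", []):
        rows.append(
            {
                "date": day["date"],
                "forecast": day["forecast"],
                "event_title": event["title"],
                "category": event["category"],
                "phq_rank": event.get("phq_rank"),
            }
        )

explainability_df = pd.DataFrame(rows)
print(explainability_df.head())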

Tips for Better Forecasts

To get the most accurate results from the Forecasts API, your input data needs to reflect meaningful demand patterns over time. Here are key tips to improve forecast performance and reliability:

  • Include enough history: At least 18 months of daily demand helps the model learn seasonal and event-driven patterns.

  • Keep it consistent: Submit clean, continuous daily data—no smoothing, gaps, or placeholder values.

  • Avoid over-segmentation: Low-volume or highly granular series often perform worse. Aggregate where possible.

  • Watch out for tiny values: Very small but non-zero demand can distort percentage-based metrics like MAPE.

  • Exclude outliers if needed: Remove early COVID-19 disruptions or other non-repeating anomalies if they don’t reflect current demand (see the cleanup sketch after this list).
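
As an illustration of the last two tips, here is a small sketch that aggregates an over-segmented series to one total per day and drops a COVID-era window. The data/granular_demand.csv file, the store_id column, and the date range are all placeholders for your own data.

import pandas as pd

# Hypothetical granular export with one row per store per day (date, store_id, demand)
raw_df = pd.read_csv("data/granular_demand.csv", parse_dates=["date"])

# Aggregate to one total per day to avoid over-segmentation
daily_df = raw_df.groupby("date", as_index=False)["demand"].sum()

# Drop a non-repeating anomaly window (placeholder COVID-era date range)
mask = (daily_df["date"] >= "2020-03-01") & (daily_df["date"] <= "2020-06-30")
daily_df = daily_df.loc[~mask]

daily_df.to_csv("data/clean_demand.csv", index=False)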

Troubleshooting

If your forecasts aren’t meeting expectations, don’t worry: there are several common reasons why accuracy might be lower than expected. We’ve put together a dedicated Troubleshooting Guide for Forecasts API to help you identify and resolve these issues. The guide covers topics like:

  • What to do if your forecast accuracy is poor (e.g. noisy or low-volume data)

  • Why not enough history can reduce model performance

  • How overly fine-grained series can lead to weak signals

  • When to remove early COVID-19 disruptions from your dataset

Before tweaking your inputs or retrying, we strongly recommend reviewing the Troubleshooting Guide; it can save a lot of time and guesswork. We also have a guide on understanding forecast accuracy metrics to help you interpret MAPE, MAE, and RMSE meaningfully.

Next Steps

  • Forecasts API Reference - Full schema, endpoints and parameters

  • Understanding Forecast Accuracy Metrics - Guide to interpreting MAPE, MAE and RMSE

  • Troubleshooting Guide for Forecasts API - Common causes of low accuracy and how to fix them