Using Event Data in Power BI

Learn how to connect PredictHQ data to Power BI using multiple methods, and build an example report.


In today's data-driven landscape, leveraging powerful analytical tools is essential for making informed decisions and uncovering hidden insights. This step-by-step guide uses Power BI, an industry-standard, robust, and user-friendly platform, as an example of a reporting suite that lets users integrate data from various sources, create interactive reports, and share insights across an organization - in this case, leveraging PredictHQ data for powerful insights.

Use Cases

Demand Forecasting with ML Models, Dynamic Pricing, Inventory Management, Workforce Optimization, Visualization and Insights

Relevant Industries

Consumer Packaged Goods, Leisure, Travel and Tourism, Marketing and Advertising, Restaurants, and Others

Overview

This tutorial covers how to connect PredictHQ data to Power BI via three methods: CSV upload, a Snowflake connection, and a direct API connection using one of our APIs - the Events API.

The data used in this guide is based on a popular location, in our case San Francisco City as a whole. Please change the location from San Francisco to the location you want to look at.

Below are the main steps involved in this guide:

  1. Building Report Parameters around a Location

    • Example Parameters for this Guide

  2. Select an Input Method

    • CSV upload

    • Snowflake Connection

    • API connection

  3. Guide to Building the Report

    • Example report download

Requirements: access to PredictHQ data via one of the three methods above, each with its own requirements.

Building Report Parameters around a Location

For the purposes of this tutorial, parameters are fixed for a standard example. They are defined below, focusing on San Francisco city for attended events over a 3-month period.

Example Parameters for this Guide:

  1. Date: user-defined; this tutorial uses a 3-month period from January 1st to March 31st, 2024

  2. Event State: Active and Predicted

  3. PHQ Attendance: attended events only - filtered to events with an attendance of at least 1

The report provided in this example shows a graph of the total number of people attending events around the location per day, as well as a list of the events happening at the location sorted by the highest attendance events first.

We find many customers want to know what is happening around a business location such as around a hotel, restaurant, store, or other location. The graph of total attendance per day shows you peaks and dips in physically attended events. This allows you to see upcoming busy days or potential demand surges as well as quieter days. The list of events allows you to see events happening on a given day in more detail.

The end result of the exercise will be a report like this:

Select an Input Method

There are several ways to connect PHQ data to Power BI or other reporting software. Below are three of the main methods users can utilize to connect and start creating reports.

CSV Upload Method

In Power BI, create a new report and press Get Data -> Text/CSV

Upload the CSV export and hit the transform data option.

Right-click the Query under Queries and go to the Advanced Editor option. The Query will be named the same as the uploaded CSV name.

This opens up a Power Query window which allows code to transform the data for us. Below is a Power Query code that will transform the columns automatically for use in the report.

Paste the Power Query code below after the first four existing lines (i.e., after the "Changed Type" step), replacing everything from the existing "in" down.

    ,#"Renamed Columns" = Table.RenameColumns(#"Changed Type",{{"impact_patterns", "impact_patterns_raw"}}),
    #"Added Custom" = Table.AddColumn(#"Renamed Columns", "impact_patterns", each Json.Document([impact_patterns_raw])),
    #"Expanded impact_patterns" = Table.ExpandListColumn(#"Added Custom", "impact_patterns"),
    #"Expanded impact_patterns1" = Table.ExpandRecordColumn(#"Expanded impact_patterns", "impact_patterns", {"vertical", "impact_type", "impacts"}, {"impact_patterns.vertical", "impact_patterns.impact_type", "impact_patterns.impacts"}),
    #"Expanded impact_patterns.impacts" = Table.ExpandListColumn(#"Expanded impact_patterns1", "impact_patterns.impacts"),
    #"Expanded impact_patterns.impacts1" = Table.ExpandRecordColumn(#"Expanded impact_patterns.impacts", "impact_patterns.impacts", {"date_local", "value", "position"}, {"impact_patterns.impacts.date_local", "impact_patterns.impacts.value", "impact_patterns.impacts.position"}),
    #"Filtered Rows" = Table.SelectRows(#"Expanded impact_patterns.impacts1", each ([impact_patterns.vertical] = "accommodation" and [impact_patterns.impacts.position] = "event_day")),
    #"Changed Number Type" = Table.TransformColumnTypes(#"Filtered Rows",{{"impact_patterns.impacts.value", Int64.Type}}),
    #"Changed Date Type" = Table.TransformColumnTypes(#"Changed Number Type", {
    {"impact_patterns.impacts.date_local", type date}
    }),
    #"Extracted Date" = Table.TransformColumns(#"Changed Date Type", {
        {"start", DateTime.Date, type date}, {"end", DateTime.Date, type date},
        {"start_local", DateTime.Date, type date}, {"end_local", DateTime.Date, type date}
    }),
    #"Changed Type1" = Table.TransformColumnTypes(#"Extracted Date",{{"phq_attendance", Int64.Type}}),
    #"Renamed Columns1" = Table.RenameColumns(#"Changed Type1", {
    {"impact_patterns.impacts.date_local", "date_local"},
    {"impact_patterns.impacts.value", "attendance_per_day"}
})
in
    #"Renamed Columns1"

Note that the pasted code starts with a comma to continue the existing step list; the comma can instead be placed at the end of the previous line if you prefer, but its function is the same. The final pasted code should look something like this:
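For readers who want to sanity-check what the Power Query transformation does outside Power BI, the flattening logic can be sketched in plain Python. This is a minimal sketch: the field names mirror the Events API schema, and the sample row below is hypothetical, for illustration only.

```python
import json

def flatten_impact_patterns(rows, vertical="accommodation", position="event_day"):
    """Expand the JSON 'impact_patterns' column into one row per daily impact,
    keeping only the requested vertical and position - mirroring the
    expand/filter/rename steps in the Power Query above."""
    out = []
    for row in rows:
        for pattern in json.loads(row["impact_patterns"]):
            if pattern["vertical"] != vertical:
                continue
            for impact in pattern["impacts"]:
                if impact["position"] != position:
                    continue
                out.append({
                    "id": row["id"],
                    "date_local": impact["date_local"],
                    "attendance_per_day": int(impact["value"]),
                })
    return out

# Hypothetical single-event row, standing in for one line of the CSV export
rows = [{
    "id": "evt1",
    "impact_patterns": json.dumps([{
        "vertical": "accommodation",
        "impact_type": "phq_attendance",
        "impacts": [
            {"date_local": "2024-01-05", "value": 1200, "position": "event_day"},
            {"date_local": "2024-01-04", "value": 300, "position": "leadup"},
        ],
    }]),
}]

print(flatten_impact_patterns(rows))
# one flattened row: the event-day accommodation impact; the leadup row is filtered out
```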

Hit Done. Hit Close & Apply and wait for the data transformation to finish processing.

Snowflake Connection Method

To connect using Snowflake you will need the following knowledge about your organization's Snowflake environment. Ask your Snowflake Administrators for these settings or refer to Snowflake's official documentation links for the variables below:

To start, navigate to the Snowflake data connection via:

Get Data -> More -> Database -> Snowflake

The Server and Warehouse info gathered above will need to be entered at this step. It should look something like the below, with the square-bracket placeholders replaced by your Server and Warehouse info.

Expand Advanced options and scroll down.

Fill in the Database where the PredictHQ Events table lives in your Snowflake structure (case sensitive). Paste this SQL into the SQL box after substituting the Schema and Table Name. This code assumes no columns have been renamed:

select e.event_id as id, e.parent_event_id, e.update_dt, e.title, e.category, ARRAY_TO_STRING(e.labels,',') as labels, e.phq_labels
    , e.phq_rank, e.phq_attendance, e.local_rank, e.status 
    , e.event_start AS "start", e.event_start_local as "start_local", e.event_end AS "end", e.event_end_local as "end_local"
    , e.predicted_end, e.timezone
    , val.value:value::INT as attendance_per_day, val.value:date_local::DATE as date_local  
    , e.country_code, e.entities, e.geo, e.placekey, e.impact_patterns
    , e.predicted_event_spend_accommodation, e.predicted_event_spend_hospitality, e.predicted_event_spend_transportation
    , e.place_hierarchies 
from [Schema].[TableName] e
, LATERAL FLATTEN(INPUT => impact_patterns) imp
, LATERAL FLATTEN(INPUT => imp.value:impacts) val
where val.value:date_local::DATE between '2024-01-01' and '2024-03-31'
    and status in ('active','predicted')
    and phq_attendance >= 1
    and category in ('community','concerts','conferences','expos','festivals','performing-arts','sports')
    and imp.value:vertical::STRING = 'accommodation' and val.value:position::STRING = 'event_day'
    and ARRAY_TO_STRING(PLACE_HIERARCHIES, ',') ilike '%5391959%'

This is what it should look like when filled in - with all square bracket placeholder text in the FROM condition replaced.

Click "OK". Click "Load Data" on the next screen.

Connection settings: DirectQuery is recommended for a constantly connected database; choose Import for a one-off import of data from the database.

Connecting to other Data Warehouses

API Connection Method

https://api.predicthq.com/v1/events/?active.gte=2024-01-01&active.lt=2024-04-01&active.tz=America/Los_Angeles&category=community,conferences,concerts,expos,festivals,performing-arts,sports&state=active,predicted&phq_attendance.gte=1&place.scope=5391959&limit=500

With this API query string, event data can start to be loaded into Power BI.
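The Events API returns results in pages, with each response containing a next URL pointing at the following page. The collection loop can be sketched in Python, with a stub dictionary standing in for real authorized HTTP calls (a real client would send an Authorization: Bearer header, and the function and page names here are illustrative, not part of any SDK):

```python
def fetch_all_events(fetch_page, first_url):
    """Follow the Events API's 'next' links until exhausted, collecting all
    'results' - the same loop the Power Query connection performs."""
    results, url = [], first_url
    while url:
        page = fetch_page(url)    # in reality: an authorized GET returning parsed JSON
        results.extend(page["results"])
        url = page.get("next")    # None/absent on the last page stops the loop
    return results

# Stub pages standing in for real API responses
pages = {
    "page1": {"results": [{"id": "a"}, {"id": "b"}], "next": "page2"},
    "page2": {"results": [{"id": "c"}], "next": None},
}
print(fetch_all_events(pages.get, "page1"))
# → all three events collected across both pages
```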

First, start a new report. Then select Get Data -> Web

Choose the Advanced tab, not the Basic default. Because the PredictHQ API uses Bearer token authorization, the Advanced tab must be selected so the API Access Token request header can be included.

Add the HTTP request header with the following information:

  1. URL parts: our created Events API URL from the above: https://api.predicthq.com/v1/events/?active.gte=2024-01-01&active.lt=2024-04-01&active.tz=America/Los_Angeles&category=community,conferences,concerts,expos,festivals,performing-arts,sports&state=active,predicted&phq_attendance.gte=1&place.scope=5391959&limit=500

  2. HTTP request header parameters:

    1. Put Authorization in the first field

    2. Put Bearer [api_token] in the field to the right, replacing [api_token] with your actual PHQ API Access Token. Leave the "Bearer " part (including the space) in place

The filled-out information should look like this:

After clicking “OK”, the Data Transformation page will open where data shaping options can be made before building the report.

Rename the Query to something relevant, as it defaults to the connection URL string parameters, and we need a stable name to reference in the Power Query code below. We recommend renaming the Query to "PredictHQ Connection".

In order to transform the columns, open Power Query and paste the code below to format and expand some columns for easy use. To do this, go to the Advanced Editor for this Query, right click on the Query name under Queries and click Advanced Editor:

Replace the entire existing Power Query code with the one below, changing the 2 lines (Lines 4 and 8) that refer to ‘[api_token]’ with the PHQ API Access Token used previously.

If you renamed the Query to something other than "PredictHQ Connection" as per our steps above, you must also rename the reference in lines 2 and 11.

let
    #"PredictHQ Connection" = List.Generate( () =>
    [URL = "https://api.predicthq.com/v1/events/?active.gte=2024-01-01&active.lt=2024-04-01&active.tz=America/Los_Angeles&category=community,conferences,concerts,expos,festivals,performing-arts,sports&state=active,predicted&phq_attendance.gte=1&place.scope=5391959&limit=500",
     Result = Json.Document(Web.Contents(URL, [Headers=[Authorization="Bearer [api_token]"]]))],
    each [URL] <> null,
    each [
        URL = [Result][next],
        Result = Json.Document(Web.Contents(URL, [Headers=[Authorization="Bearer [api_token]"]]))
    ]
),
    #"Converted to Table" = Table.FromList(#"PredictHQ Connection", Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"Result"}, {"Column1.Result"}),
    #"Expanded Column1.Result" = Table.ExpandRecordColumn(#"Expanded Column1", "Column1.Result", {"results"}, {"Column1.Result.results"}),
    #"Expanded Column1.Result.results" = Table.ExpandListColumn(#"Expanded Column1.Result", "Column1.Result.results"),
    #"Expanded Column1.Result.results1" = Table.ExpandRecordColumn(#"Expanded Column1.Result.results", "Column1.Result.results", {"id", "title", "description", "category", "labels", "rank", "local_rank", "phq_attendance", "entities", "duration", "start", "start_local", "end", "end_local", "updated", "first_seen", "timezone", "location", "geo", "impact_patterns", "scope", "country", "place_hierarchies", "state", "private", "predicted_event_spend", "predicted_event_spend_industries", "phq_labels"}),
    #"Expanded impact_patterns" = Table.ExpandListColumn(#"Expanded Column1.Result.results1", "impact_patterns"),
    #"Expanded impact_patterns1" = Table.ExpandRecordColumn(#"Expanded impact_patterns", "impact_patterns", {"vertical", "impact_type", "impacts"}, {"impact_patterns.vertical", "impact_patterns.impact_type", "impact_patterns.impacts"}),
    #"Expanded impact_patterns.impacts" = Table.ExpandListColumn(#"Expanded impact_patterns1", "impact_patterns.impacts"),
    #"Expanded impact_patterns.impacts1" = Table.ExpandRecordColumn(#"Expanded impact_patterns.impacts", "impact_patterns.impacts", {"date_local", "value", "position"}, {"impact_patterns.impacts.date_local", "impact_patterns.impacts.value", "impact_patterns.impacts.position"}),
    #"Filtered Rows" = Table.SelectRows(#"Expanded impact_patterns.impacts1", each ([impact_patterns.vertical] = "accommodation" and [impact_patterns.impacts.position] = "event_day")),
    #"Changed Number Type" = Table.TransformColumnTypes(#"Filtered Rows",{{"impact_patterns.impacts.value", Int64.Type}}),
    #"Changed Date Type" = Table.TransformColumnTypes(#"Changed Number Type", {
    {"impact_patterns.impacts.date_local", type date},
    {"start", type datetime}, {"end", type datetime},
    {"start_local", type datetime}, {"end_local", type datetime}
    }),
    #"Extracted Date" = Table.TransformColumns(#"Changed Date Type", {
        {"start", DateTime.Date, type date}, {"end", DateTime.Date, type date},
        {"start_local", DateTime.Date, type date}, {"end_local", DateTime.Date, type date}
    }),
    #"Changed Type" = Table.TransformColumnTypes(#"Extracted Date",{{"phq_attendance", Int64.Type}}),
    #"Renamed Columns" = Table.RenameColumns(#"Changed Type", {
    {"impact_patterns.impacts.date_local", "date_local"},
    {"impact_patterns.impacts.value", "attendance_per_day"}
    })
in
    #"Renamed Columns"

Click Close & Apply and wait for the data transformation to finish processing through multiple API pages.

Guide to Building the Report

Using any of the three methods above will get PredictHQ Events data loaded and transformed into the same format, ready to be used in a report. Not all the columns were transformed, just the ones used in this guide.

This guide creates a connected chart and table that covers the defined time period and shows the attendance per day in the chosen location - in the example, San Francisco city as a whole. The chart breaks up attendance per day for the visualization, while the table shows event details and attendance in full, not split by day. Date results are shown in UTC; use the "_local" date columns for the local date.

To begin, insert a blank chart and table visualization using the Insert -> New Visual tab options, with the chart on top taking up half the screen, and the table on the bottom filling the other half.

Group as one (shift-click both boxes, right-click on one of them, and click the Group -> Group option).

Before the next step of filling in the chart and table, add Filters for the page: drag the 'date_local' field from the Data tab on the right to the “Filters on this page” section under Filters.

Change the drop-down to Advanced filtering and add the following:

"Is on or after" the start of the selected date range AND "is before" the day after the date range ends; then click "apply filter" in the bottom right of the filter menu. In the example, that means anything on or after January 1st, 2024 and anything before April 1st, 2024.
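This filter forms a half-open interval: everything on or after the start and strictly before the day after the end, which safely captures the whole final day. A small Python sketch of the same logic (the function name is illustrative):

```python
from datetime import date, timedelta

range_start = date(2024, 1, 1)
range_end = date(2024, 3, 31)   # last day included in the report

def in_report_range(d, start=range_start, end=range_end):
    """'Is on or after' start AND 'is before' the day after end -
    a half-open interval that includes all of the last day."""
    return start <= d < end + timedelta(days=1)

print(in_report_range(date(2024, 3, 31)))  # True  - last day is included
print(in_report_range(date(2024, 4, 1)))   # False - day after is excluded
```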

Now fill the chart axes. The X-axis gets the date_local field; the Y-axis gets the attendance_per_day field (this should default to SUM, which is correct).

For the table, drag these fields over and resize the columns as needed to fit everything:

Table: id, title, category, phq_attendance, start_local, end_local

For all fields that involve a date (date_local, start_local, end_local), remove the default Date Hierarchy format to show the actual date: use the drop-down in the Visualizations column and select the field name instead of "Date Hierarchy". If you prefer the Date Hierarchy, feel free to leave it as is.

For phq_attendance in the table, use the drop-down to remove the summary; this summary isn't actually grouping anything, so it is an unnecessary default. Note that we only want to stop the summarization in the table; leave the chart as is.

Rename the chart title by clicking on the chart and going to the Visualizations tab -> General -> Title. Let’s rename it to “Event Attendance per day in San Francisco”.

Click the "phq_attendance" column header in the table twice to sort from highest to lowest attendance.

The final result should look like the following:

The picture above shows how this analysis can be used; by clicking on a spike (or any period on the chart) the report shows the events active during that period. The table data does not show the attendance per day like the chart, but the overall attendance of the event's full duration.

Example API Connection Report Template

Below is a downloadable Power BI template that will automatically create the example report used throughout this guide, using the API Connection method.

Upon opening the template, you will be prompted to enter an API Access Token. Entering the token will let the report automatically populate and build according to the parameters set out in this guide. Allow 10-20 seconds between steps as data loads and queries run in the background.

Once the data connection has loaded for a bit you might be prompted for a connection method screen like below. Please select Anonymous and click Connect.

Example Report:

CSV: a PredictHQ account - sign up if you don't already have one.

Snowflake: PredictHQ data available in your organization's Snowflake environment.

API: a PredictHQ API Access Token.

All of these parameters can be modified based on user needs; see our Events API documentation for details on what they mean and how they can be adjusted to suit different use cases.

Categories: community, conferences, concerts, expos, festivals, performing-arts, sports - these are our attendance-based event categories

Location: San Francisco city (place ID 5391959)

Location could be substituted for a specific latitude and longitude relating to an individual store, or could be scoped even wider depending on need. We suggest utilizing our Suggested Radius API to home in on a specific shop location and pull only events within a more accurate radius based on those results. For now, we will look at citywide events in San Francisco as our example.

Our customers use this in a variety of ways. For example, an accommodation customer may use a report like this to set hotel room pricing per day, increasing prices on days with many events happening. A restaurant customer looking at staffing might roster more people when they see many events happening near their location, and reduce staff levels when fewer events are happening. See our tutorials by use case for more examples.

CSV upload: The quick and easy way to connect data straight from the PredictHQ WebApp into reporting software. If a static view of the data is all you need, this method gets it done fast. It does not refresh or update the data when it changes, and events are dynamic - they get canceled, postponed, move location, and so on. A CSV is a good way to do initial modeling, but we'd suggest calling the API or connecting to a data warehouse moving forward.

Snowflake connection: Choosing Snowflake as the data source for Power BI is highly recommended due to its robust data warehousing capabilities and seamless integration. Snowflake provides dynamic scalability and real-time data access, enhancing the accuracy and efficiency of reports, along with straightforward connectivity and powerful query performance.

API connection: Another preferred method for connecting our dynamic events data to business intelligence software is to use our APIs. This way the report is connected to an ever-updating data source and is always up to date.

We will use the PredictHQ WebApp to get our CSV. Filter the events based on the parameters laid out in the Example Parameters section above. For more information on using the WebApp search, see the Events Search documentation. Fill in the parameters and hit search.

Once the search has completed, hit the Export button on the right to get a CSV. For more details on exporting, see the guide to exporting events data from the WebApp. Once the export has downloaded, it's ready for use in Power BI. The filename by default will be "Events-Export-zzzz-on-xxxx", where x is the date of the export and z is the location - feel free to rename it.

This code expands out the 'impact_patterns' column (see Predicted Impact Patterns in our technical documentation for more information) and filters it to the accommodation vertical and the actual attendance distribution. It renames some essential columns and transforms some column formats for easier use in reporting. It is an involved, multi-step process; the Power Query code above is the final output of this multi-stage transformation.

After completing these steps, we have successfully loaded a CSV extract of PHQ Events data into Power BI, ready for use in visuals and reporting. See the Guide to Building the Report step for the next steps.

Server Name: Usually in format <>.snowflakecomputing.com

Warehouse: A virtual warehouse is what you use to run queries. See which warehouses are available to you using Snowflake's SHOW WAREHOUSES command.

Table Location: the Database, Schema, and Table Name for your table.

This code performs the data transformation and filtering in SQL. It filters to the parameters laid out in the Example Parameters section, and transforms some columns for ease of use in the report. The most important transformed column is 'impact_patterns', which we use to find the attendance spread per day across a multi-day event. See Predicted Impact Patterns in our technical documentation for more information.

After completing these steps, we have successfully connected Events data from Snowflake into Power BI, ready for use in visuals and reporting with automatic data refreshes. See the Guide to Building the Report step for the next steps.

See Loading Event Data into a Data Warehouse for an example of how to load event data into Google BigQuery or other data warehouses. See also this guide on how to connect Power BI to Google BigQuery.

PredictHQ has a few APIs that can be used to build reports; for this example we will stick to the Events API. This process assumes a PredictHQ API access token has been created by following the API Quickstart guide.

Power BI will connect using the Events API URL (https://api.predicthq.com/v1/events/), but query parameters must be added to this URL for the Power BI connection, in line with the parameters outlined in the Example Parameters for this Guide section.

Following these parameters and the documentation we will end up with a URL string like the one below:
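The original page shows the assembled URL as a screenshot. As a hedged reconstruction, the snippet below builds a URL with the parameters discussed in this guide; the parameter names (place.scope, active.gte, active.lte, active.tz, limit) are real Events API parameters, but the date range and limit values are assumptions for illustration.

```python
from urllib.parse import urlencode

base = "https://api.predicthq.com/v1/events/"

# Example query parameters (values are illustrative; see the Events API
# documentation for the full parameter list).
params = {
    "place.scope": "5391959",            # geonames Place ID for San Francisco
    "active.gte": "2024-01-01",          # events active on or after this date
    "active.lte": "2024-12-31",          # events active on or before this date
    "active.tz": "America/Los_Angeles",  # interpret the dates in this timezone
    "limit": "500",                      # more results per page than the default 10
}

url = f"{base}?{urlencode(params)}"
```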

Note: Scope uses the Place ID (geonames ID) for San Francisco, 5391959 (see our tech docs for info on Place ID). If you were looking for events happening around a business location, you would instead use the within parameter with the latitude and longitude of your business location and the radius from the Suggested Radius API. The time zone parameter (active.tz) filters results based on the given time zone. The limit parameter allows more results per “page” than the default 10, which makes loading faster.

See also our filtering guide for details on how to query the Events API for events impacting your locations.

This code expands out the 'impact_patterns' column (see Impact Patterns in our technical documentation for more information) and filters it to the accommodation vertical and the attendance distribution. It renames some essential columns and accounts for the API's pagination, making sure all results are returned. It is an involved, multi-step process; the Power Query below is the final output of this multi-stage transformation.
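The pagination handling works by following the 'next' URL that each Events API response includes until no further page exists. A minimal sketch of that loop is below; the page fetcher is injected as a callable so the control flow can be shown (and tested) without a live API call, and the stubbed URLs and event IDs are illustrative. In practice the fetcher would be an HTTP GET with the 'Authorization: Bearer <token>' header set.

```python
def fetch_all_events(first_url, get_page):
    """Follow the Events API's 'next' links until every page is collected.

    `get_page` is any callable that takes a URL and returns the decoded
    JSON body of that page.
    """
    results = []
    url = first_url
    while url:
        page = get_page(url)
        results.extend(page.get("results", []))
        url = page.get("next")  # None or absent on the last page
    return results

# Stubbed two-page response to show the control flow (illustrative data).
pages = {
    "https://api.predicthq.com/v1/events/?limit=2":
        {"results": [{"id": "a"}, {"id": "b"}],
         "next": "https://api.predicthq.com/v1/events/?limit=2&offset=2"},
    "https://api.predicthq.com/v1/events/?limit=2&offset=2":
        {"results": [{"id": "c"}], "next": None},
}
events = fetch_all_events("https://api.predicthq.com/v1/events/?limit=2", pages.get)
# events → [{'id': 'a'}, {'id': 'b'}, {'id': 'c'}]
```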

After this step the data has been successfully loaded and transformed in Power BI and is ready for building a report. A pre-built template of this API connection report is available at the end, in the Example API Connection Report Template section.

A useful addition to this basic view is a drill-down on the table: add a new table visual to the group with the 'id', 'date_local', and 'attendance_per_day' columns, showing how the attendance of an event is spread over multiple days (if it is a multi-day event). For more on multi-day events, see our Working with Multi-day Events documentation.
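As a quick sanity check on that drill-down, the per-day rows for one event should sum back to the event's overall attendance figure. A small sketch with made-up values:

```python
# Per-day attendance rows for one multi-day event (illustrative values).
rows = [
    {"id": "evt1", "date_local": "2024-05-01", "attendance_per_day": 1200},
    {"id": "evt1", "date_local": "2024-05-02", "attendance_per_day": 800},
    {"id": "evt1", "date_local": "2024-05-03", "attendance_per_day": 500},
]

# Group by event id to see how attendance spreads across its days,
# like the drill-down table visual does.
spread = {}
for r in rows:
    spread.setdefault(r["id"], []).append((r["date_local"], r["attendance_per_day"]))

total = sum(attendance for _, attendance in spread["evt1"])  # 2500 over three days
```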

Customers can add their own data to this chart to compare peaks and troughs of attendance against sales in a basic comparison report. For deeper analysis, we suggest using our Beam functionality to understand which types of events impact demand; the Events API alone gives only a high-level view of the story without additional analysis from PredictHQ.

If there are any issues with this template, refer to the API Connection Method steps and ensure all settings match those steps.

Screenshot captions from the original guide:
- WebApp example filters
- CSV connection: Get Data -> Text/CSV, 'Transform Data', right-click the Query -> Advanced Editor, paste the CSV Power Query, Close & Apply
- Snowflake connection: Get Data -> More, Database -> Snowflake, enter Server and Warehouse info
- API connection: Get Data -> Web, enter the connection URL and header, rename the Query, right-click the renamed Query -> Advanced Editor, Close & Apply
- Building the report: blank chart and table grouped, date_local filter on the page, remove the Date Hierarchy, remove Summarization from the table, rename the chart title
- Final report result
- Fill in the PredictHQ API access token when prompted; since the token has already been entered, select Anonymous for the Web connection