Transforming Event Data into ML-Ready Features using SQL
Using Snowflake SQL to recreate the Features API
This guide uses a publicly available PredictHQ event sample table called
PREDICTHQ_EVENTS_RETAIL_LONDON
Please replace this table name in all instances below with the name of the events data table that has been provisioned by PredictHQ as per the Snowflake Secure Data Share.
The rest of the guide uses temporary tables, but these can be converted into permanent tables as needed.
Once SAVED_LOCATIONS has been created as per the parent page of this guide, the following steps are required and are broken out below:
Modify the input table format for use with the code in this guide
Generate daily aggregated statistics for each location by…
attendance-based features
rank-based features
impact-based features
The date range in the following code examples should be updated based on the intended use case (see the example after this list):
For training a machine learning model, update the dates to get historical data for the locations
If running a model in production and forecasting future demand, update the dates to cover the visibility window of the forecast - e.g. the next week, month, or months
Join all features together in a single output table
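As an illustrative sketch only (assuming SAVED_LOCATIONS carries the date_start and date_end columns used in Step 1, and using placeholder dates), the date range can be adjusted directly on that table before the daily table is built:

---- Example only: placeholder dates for a historical training window
UPDATE saved_locations
SET date_start = '2023-01-01',
    date_end   = '2023-12-31';

---- Example only: placeholder dates for a 4-week forward-looking forecast window
UPDATE saved_locations
SET date_start = CURRENT_DATE,
    date_end   = DATEADD('day', 28, CURRENT_DATE);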
Step 1: Modify the input table format
Once the SAVED_LOCATIONS input table is created, the code below reshapes it into a day-by-day table called SAVED_LOCATIONS_DAILY, with one row per location per day:
---- Split the table out so that one row equals one day between the date range
CREATE OR REPLACE TEMP TABLE saved_locations_daily AS
SELECT
    DATE(date_start) + value::INT AS date,
    s.location,
    s.lat,
    s.lon,
    s.radius,
    LOWER(s.radius_unit) AS radius_unit -- forcing lower case in case of data entry mistakes
FROM saved_locations s,
    TABLE(FLATTEN(ARRAY_GENERATE_RANGE(0, DATEDIFF('day', date_start, date_end) + 1))) t;
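Optionally, the result can be inspected with the same pattern used at the end of the later feature blocks:

SELECT * FROM saved_locations_daily ORDER BY location, date;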
Step 2: Calculate Daily Aggregated ML Features with SQL in Snowflake
Each Feature set is calculated in its own block; see the column headers in each code block below for the Features that can be generated.
PHQ Attendance Features
Values are calculated as the sum of predicted attendance for the day at a given location within the defined radius.
PHQ Attendance Features
---- PHQ Attendance Features
CREATE OR REPLACE TEMP TABLE phq_attendance_features AS
WITH events_attended AS ( -- Attendance Features for the 7 main categories
    SELECT
        e.event_id,
        s.date,
        s.location,
        e.category,
        imp.value:value::INT AS phq_attendance
    FROM predicthq.predicthq_events_retail_london e
    RIGHT JOIN saved_locations_daily s
        ON ST_DISTANCE(e.geo, ST_MAKEPOINT(s.lon, s.lat)) <=
            CASE
                WHEN s.radius_unit = 'mi' THEN s.radius * 1609.34
                WHEN s.radius_unit = 'km' THEN s.radius * 1000
            END
        AND TO_DATE(CONVERT_TIMEZONE(CASE WHEN e.timezone IS NOT NULL THEN e.timezone ELSE 'UTC' END, e.event_start)) <= s.date
        AND TO_DATE(CONVERT_TIMEZONE(CASE WHEN e.timezone IS NOT NULL THEN e.timezone ELSE 'UTC' END, e.event_end)) >= s.date,
        LATERAL FLATTEN(INPUT => e.impact_patterns) vert,
        LATERAL FLATTEN(INPUT => vert.value:impacts) imp
    WHERE imp.value:date_local::DATE = s.date
        AND vert.value:vertical::STRING = 'retail' -- use the impact-pattern vertical that matches your industry
        AND imp.value:position::STRING = 'event_day'
        AND e.phq_attendance IS NOT NULL
        AND e.category IN ('community', 'concerts', 'conferences', 'expos', 'festivals', 'performing-arts', 'sports')
),
events_attended_other AS ( -- Attendance Features for special categories
    SELECT
        e.event_id,
        s.date,
        s.location,
        e.category,
        e.phq_attendance,
        CASE
            WHEN category = 'academic' AND ARRAY_CONTAINS('social'::VARIANT, e.labels) THEN 'social'
            WHEN category = 'academic' AND ARRAY_CONTAINS('graduation'::VARIANT, e.labels) THEN 'graduation'
            ELSE ''
        END AS academic_split
    FROM predicthq.predicthq_events_retail_london e
    RIGHT JOIN saved_locations_daily s
        ON ST_DISTANCE(e.geo, ST_MAKEPOINT(s.lon, s.lat)) <=
            CASE
                WHEN s.radius_unit = 'mi' THEN s.radius * 1609.34
                WHEN s.radius_unit = 'km' THEN s.radius * 1000
            END
        AND TO_DATE(CONVERT_TIMEZONE(CASE WHEN e.timezone IS NOT NULL THEN e.timezone ELSE 'UTC' END, e.event_start)) <= s.date
        AND TO_DATE(CONVERT_TIMEZONE(CASE WHEN e.timezone IS NOT NULL THEN e.timezone ELSE 'UTC' END, e.event_end)) >= s.date
    WHERE e.phq_attendance IS NOT NULL
        AND e.category IN ('school-holidays', 'academic')
),
attendance_group AS ( -- Group the 7 main categories daily and make sure a result is displayed each day even if it's just 0
    SELECT
        data_range.date,
        data_range.location,
        SUM(CASE WHEN a.category = 'community' THEN a.phq_attendance ELSE 0 END) AS phq_attendance_community,
        SUM(CASE WHEN a.category = 'concerts' THEN a.phq_attendance ELSE 0 END) AS phq_attendance_concerts,
        SUM(CASE WHEN a.category = 'conferences' THEN a.phq_attendance ELSE 0 END) AS phq_attendance_conferences,
        SUM(CASE WHEN a.category = 'expos' THEN a.phq_attendance ELSE 0 END) AS phq_attendance_expos,
        SUM(CASE WHEN a.category = 'festivals' THEN a.phq_attendance ELSE 0 END) AS phq_attendance_festivals,
        SUM(CASE WHEN a.category = 'performing-arts' THEN a.phq_attendance ELSE 0 END) AS phq_attendance_performing_arts,
        SUM(CASE WHEN a.category = 'sports' THEN a.phq_attendance ELSE 0 END) AS phq_attendance_sports
    FROM (SELECT date, location FROM saved_locations_daily) data_range
    LEFT JOIN events_attended a
        ON data_range.date = a.date
        AND data_range.location = a.location
    GROUP BY data_range.date, data_range.location
    ORDER BY data_range.date, data_range.location
),
attendance_group_other AS ( -- Group the other categories daily and make sure a result is displayed each day even if it's just 0
    SELECT
        data_range.date,
        data_range.location,
        SUM(CASE WHEN ao.category = 'school-holidays' THEN ao.phq_attendance ELSE 0 END) AS phq_attendance_school_holidays,
        SUM(CASE WHEN ao.academic_split = 'graduation' THEN ao.phq_attendance ELSE 0 END) AS phq_attendance_academic_graduation,
        SUM(CASE WHEN ao.academic_split = 'social' THEN ao.phq_attendance ELSE 0 END) AS phq_attendance_academic_social
    FROM (SELECT date, location FROM saved_locations_daily) data_range
    LEFT JOIN events_attended_other ao
        ON data_range.date = ao.date
        AND data_range.location = ao.location
    GROUP BY data_range.date, data_range.location
    ORDER BY data_range.date, data_range.location
)
SELECT -- final select for phq_attendance_features
    ag.date,
    ag.location,
    ag.phq_attendance_community,
    ag.phq_attendance_concerts,
    ag.phq_attendance_conferences,
    ag.phq_attendance_expos,
    ag.phq_attendance_festivals,
    ag.phq_attendance_performing_arts,
    ag.phq_attendance_sports,
    ago.phq_attendance_school_holidays,
    ago.phq_attendance_academic_graduation,
    ago.phq_attendance_academic_social
FROM attendance_group ag
LEFT JOIN attendance_group_other ago
    ON ag.location = ago.location
    AND ag.date = ago.date;

SELECT * FROM phq_attendance_features ORDER BY location, date;
If metrics other than SUM are desired, use the code below as a template for each column. The category name in the code (defaulted to 'community' in this example) will need to be replaced depending on which PHQ Attendance Feature is being generated. Refer to the column code above for the available Feature categories.
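For instance, a sketch of an average-attendance column for the 'community' category could look like the line below; the AVG aggregate and the _avg suffix are illustrative choices, and ELSE NULL is used so rows from other categories are not counted as zeros in the average. Drop it in place of the corresponding SUM(...) line inside the attendance_group CTE:

---- Template (illustrative): swap SUM for another aggregate such as AVG
AVG(CASE WHEN a.category = 'community' THEN a.phq_attendance ELSE NULL END) AS phq_attendance_community_avg,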
Values are calculated as a count of events occurring at each rank level, per day, per location. If an event occurs over multiple days, it will have a result in each day until the event is over. Each rank level is returned as its own column.
PHQ Rank Features
---- PHQ Rank Features
CREATE OR REPLACE TEMP TABLE phq_rank_features AS
WITH events_ranked AS ( -- Pull ranked events within range
    SELECT
        e.event_id,
        s.date,
        s.location,
        e.category,
        e.phq_rank,
        CASE
            WHEN category = 'academic' AND ARRAY_CONTAINS('academic-session'::VARIANT, e.labels) THEN 'session'
            WHEN category = 'academic' AND ARRAY_CONTAINS('exam'::VARIANT, e.labels) THEN 'exam'
            WHEN category = 'academic' AND ARRAY_CONTAINS('holiday'::VARIANT, e.labels) THEN 'holiday'
            ELSE ''
        END AS academic_split,
        CASE
            WHEN e.phq_rank BETWEEN 0 AND 20 THEN 1
            WHEN e.phq_rank BETWEEN 21 AND 40 THEN 2
            WHEN e.phq_rank BETWEEN 41 AND 60 THEN 3
            WHEN e.phq_rank BETWEEN 61 AND 80 THEN 4
            ELSE 5
        END AS rank_level
    FROM predicthq.predicthq_events_retail_london e
    RIGHT JOIN saved_locations_daily s
        ON ST_DISTANCE(e.geo, ST_MAKEPOINT(s.lon, s.lat)) <=
            CASE
                WHEN s.radius_unit = 'mi' THEN s.radius * 1609.34
                WHEN s.radius_unit = 'km' THEN s.radius * 1000
            END
        AND TO_DATE(CONVERT_TIMEZONE(CASE WHEN e.timezone IS NOT NULL THEN e.timezone ELSE 'UTC' END, e.event_start)) <= s.date
        AND TO_DATE(CONVERT_TIMEZONE(CASE WHEN e.timezone IS NOT NULL THEN e.timezone ELSE 'UTC' END, e.event_end)) >= s.date
    WHERE e.phq_rank IS NOT NULL
        AND e.category IN ('academic', 'public-holidays', 'school-holidays', 'observances')
),
events_ranked_distinct AS ( -- Get count per rank level
    SELECT
        r.date,
        r.location,
        r.category,
        r.academic_split,
        r.rank_level,
        COUNT(DISTINCT r.event_id) AS distinct_event_count
    FROM events_ranked r
    GROUP BY r.date, r.location, r.category, r.academic_split, r.rank_level
)
SELECT -- Final formatting and select for phq_rank_features
    data_range.date,
    data_range.location,
    SUM(CASE WHEN r.category = 'observances' AND r.rank_level = 1 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_observances_rank_level1,
SUM(CASE WHEN r.category = 'observances' AND r.rank_level = 2 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_observances_rank_level2,
SUM(CASE WHEN r.category = 'observances' AND r.rank_level = 3 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_observances_rank_level3,
SUM(CASE WHEN r.category = 'observances' AND r.rank_level = 4 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_observances_rank_level4,
SUM(CASE WHEN r.category = 'observances' AND r.rank_level = 5 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_observances_rank_level5,
SUM(CASE WHEN r.category = 'public-holidays' AND r.rank_level = 1 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_public_holidays_rank_level1,
SUM(CASE WHEN r.category = 'public-holidays' AND r.rank_level = 2 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_public_holidays_rank_level2,
SUM(CASE WHEN r.category = 'public-holidays' AND r.rank_level = 3 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_public_holidays_rank_level3,
SUM(CASE WHEN r.category = 'public-holidays' AND r.rank_level = 4 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_public_holidays_rank_level4,
SUM(CASE WHEN r.category = 'public-holidays' AND r.rank_level = 5 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_public_holidays_rank_level5,
SUM(CASE WHEN r.category = 'school-holidays' AND r.rank_level = 1 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_school_holidays_rank_level1,
SUM(CASE WHEN r.category = 'school-holidays' AND r.rank_level = 2 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_school_holidays_rank_level2,
SUM(CASE WHEN r.category = 'school-holidays' AND r.rank_level = 3 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_school_holidays_rank_level3,
SUM(CASE WHEN r.category = 'school-holidays' AND r.rank_level = 4 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_school_holidays_rank_level4,
SUM(CASE WHEN r.category = 'school-holidays' AND r.rank_level = 5 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_school_holidays_rank_level5,
SUM(CASE WHEN r.academic_split = 'session' AND r.rank_level = 1 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_academic_session_rank_level1,
SUM(CASE WHEN r.academic_split = 'session' AND r.rank_level = 2 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_academic_session_rank_level2,
SUM(CASE WHEN r.academic_split = 'session' AND r.rank_level = 3 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_academic_session_rank_level3,
SUM(CASE WHEN r.academic_split = 'session' AND r.rank_level = 4 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_academic_session_rank_level4,
SUM(CASE WHEN r.academic_split = 'session' AND r.rank_level = 5 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_academic_session_rank_level5,
SUM(CASE WHEN r.academic_split = 'exam' AND r.rank_level = 1 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_academic_exam_rank_level1,
SUM(CASE WHEN r.academic_split = 'exam' AND r.rank_level = 2 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_academic_exam_rank_level2,
SUM(CASE WHEN r.academic_split = 'exam' AND r.rank_level = 3 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_academic_exam_rank_level3,
SUM(CASE WHEN r.academic_split = 'exam' AND r.rank_level = 4 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_academic_exam_rank_level4,
SUM(CASE WHEN r.academic_split = 'exam' AND r.rank_level = 5 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_academic_exam_rank_level5,
SUM(CASE WHEN r.academic_split = 'holiday' AND r.rank_level = 1 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_academic_holiday_rank_level1,
SUM(CASE WHEN r.academic_split = 'holiday' AND r.rank_level = 2 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_academic_holiday_rank_level2,
SUM(CASE WHEN r.academic_split = 'holiday' AND r.rank_level = 3 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_academic_holiday_rank_level3,
SUM(CASE WHEN r.academic_split = 'holiday' AND r.rank_level = 4 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_academic_holiday_rank_level4,
SUM(CASE WHEN r.academic_split = 'holiday' AND r.rank_level = 5 THEN r.distinct_event_count ELSE 0 END) AS phq_rank_academic_holiday_rank_level5
FROM (SELECT date, location FROM saved_locations_daily) data_range
LEFT JOIN events_ranked_distinct r
    ON data_range.date = r.date
    AND data_range.location = r.location
GROUP BY data_range.date, data_range.location
ORDER BY data_range.date, data_range.location;

SELECT * FROM phq_rank_features ORDER BY location, date;
PHQ Impact Features
Values are calculated as the MAX of the Ranks of events occurring on each day, i.e. the highest-ranked Severe Weather event of each type per day, per location.
PHQ Impact Features
---- PHQ Impact Features
CREATE OR REPLACE TEMP TABLE phq_impact_features AS
WITH events_impact AS ( -- Pull impact events within range
    SELECT DISTINCT
        e.event_id,
        imp.value:date_local::DATE AS date,
        s.location,
        e.category,
        imp.value:value::INT AS phq_rank,
        CASE
            WHEN ARRAY_CONTAINS('blizzard'::VARIANT, e.labels) THEN 'blizzard'
            WHEN ARRAY_CONTAINS('snow'::VARIANT, e.labels) THEN 'cold-wave-snow'
            WHEN ARRAY_CONTAINS('cold-wave'::VARIANT, e.labels) AND ARRAY_CONTAINS('storm'::VARIANT, e.labels) THEN 'cold-wave-storm'
            WHEN ARRAY_CONTAINS('cold-wave'::VARIANT, e.labels) THEN 'cold-wave'
            WHEN ARRAY_CONTAINS('air-quality'::VARIANT, e.labels) OR ARRAY_CONTAINS('fog'::VARIANT, e.labels) OR ARRAY_CONTAINS('sand'::VARIANT, e.labels) THEN 'air-quality'
            WHEN ARRAY_CONTAINS('thunderstorm'::VARIANT, e.labels) THEN 'thunderstorm'
            WHEN ARRAY_CONTAINS('tropical-storm'::VARIANT, e.labels) THEN 'tropical-storm'
            WHEN ARRAY_CONTAINS('tornado'::VARIANT, e.labels) THEN 'tornado'
            WHEN ARRAY_CONTAINS('hurricane'::VARIANT, e.labels) OR ARRAY_CONTAINS('cyclone'::VARIANT, e.labels) OR ARRAY_CONTAINS('typhoon'::VARIANT, e.labels) THEN 'hurricane'
            WHEN ARRAY_CONTAINS('dust'::VARIANT, e.labels) AND ARRAY_CONTAINS('storm'::VARIANT, e.labels) THEN 'dust-storm'
            WHEN ARRAY_CONTAINS('dust'::VARIANT, e.labels) THEN 'dust'
            WHEN ARRAY_CONTAINS('rain'::VARIANT, e.labels) OR ARRAY_CONTAINS('flood'::VARIANT, e.labels) THEN 'flood'
            WHEN ARRAY_CONTAINS('heat-wave'::VARIANT, e.labels) THEN 'heat-wave'
            WHEN ARRAY_CONTAINS('wind'::VARIANT, e.labels) OR ARRAY_CONTAINS('hazardous-surf'::VARIANT, e.labels) OR ARRAY_CONTAINS('storm'::VARIANT, e.labels) THEN 'dust-storm'
        END AS weather_category
    FROM predicthq.predicthq_events_retail_london e
    RIGHT JOIN saved_locations_daily s
        ON ST_DISTANCE(e.geo, ST_MAKEPOINT(s.lon, s.lat)) <=
            CASE
                WHEN s.radius_unit = 'mi' THEN s.radius * 1609.34
                WHEN s.radius_unit = 'km' THEN s.radius * 1000
            END
        AND TO_DATE(CONVERT_TIMEZONE(CASE WHEN e.timezone IS NOT NULL THEN e.timezone ELSE 'UTC' END, e.event_start)) <= s.date
        AND TO_DATE(CONVERT_TIMEZONE(CASE WHEN e.timezone IS NOT NULL THEN e.timezone ELSE 'UTC' END, e.event_end)) >= s.date,
        LATERAL FLATTEN(INPUT => e.impact_patterns) vert,
        LATERAL FLATTEN(INPUT => vert.value:impacts) imp
    WHERE vert.value:vertical::STRING = 'retail'
        AND e.category = 'severe-weather'
        AND weather_category IS NOT NULL
)
SELECT -- final formatting and select for phq_impact_features
    data_range.date,
    data_range.location,
    IFNULL(MAX(CASE WHEN i.weather_category = 'air-quality' THEN i.phq_rank ELSE NULL END), 0) AS phq_impact_severe_weather_air_quality_retail,
IFNULL(MAX(CASE WHEN i.weather_category = 'blizzard' THEN i.phq_rank ELSE NULL END),0) AS phq_impact_severe_weather_blizzard_retail,
IFNULL(MAX(CASE WHEN i.weather_category = 'cold-wave' THEN i.phq_rank ELSE NULL END),0) AS phq_impact_severe_weather_cold_wave_retail,
IFNULL(MAX(CASE WHEN i.weather_category = 'cold-wave-snow' THEN i.phq_rank ELSE NULL END),0) AS phq_impact_severe_weather_cold_wave_snow_retail,
IFNULL(MAX(CASE WHEN i.weather_category = 'cold-wave-storm' THEN i.phq_rank ELSE NULL END),0) AS phq_impact_severe_weather_cold_wave_storm_retail,
IFNULL(MAX(CASE WHEN i.weather_category = 'dust' THEN i.phq_rank ELSE NULL END),0) AS phq_impact_severe_weather_dust_retail,
IFNULL(MAX(CASE WHEN i.weather_category = 'dust-storm' THEN i.phq_rank ELSE NULL END),0) AS phq_impact_severe_weather_dust_storm_retail,
IFNULL(MAX(CASE WHEN i.weather_category = 'flood' THEN i.phq_rank ELSE NULL END),0) AS phq_impact_severe_weather_flood_retail,
IFNULL(MAX(CASE WHEN i.weather_category = 'heat-wave' THEN i.phq_rank ELSE NULL END),0) AS phq_impact_severe_weather_heat_wave_retail,
IFNULL(MAX(CASE WHEN i.weather_category = 'hurricane' THEN i.phq_rank ELSE NULL END),0) AS phq_impact_severe_weather_hurricane_retail,
IFNULL(MAX(CASE WHEN i.weather_category = 'thunderstorm' THEN i.phq_rank ELSE NULL END),0) AS phq_impact_severe_weather_thunderstorm_retail,
IFNULL(MAX(CASE WHEN i.weather_category = 'tornado' THEN i.phq_rank ELSE NULL END),0) AS phq_impact_severe_weather_tornado_retail,
IFNULL(MAX(CASE WHEN i.weather_category = 'tropical-storm' THEN i.phq_rank ELSE NULL END),0) AS phq_impact_severe_weather_tropical_storm_retail
FROM (SELECT date, location FROM saved_locations_daily) data_range
LEFT JOIN events_impact i
    ON data_range.date = i.date
    AND data_range.location = i.location
GROUP BY data_range.date, data_range.location
ORDER BY data_range.date, data_range.location;

SELECT * FROM phq_impact_features ORDER BY location, date;
If metrics other than MAX are desired, use the code below as a template for each column. The weather_category name in the code (defaulted to 'air-quality' in this example) will need to be replaced depending on which Feature is being generated. Refer to the column code above for the available weather_category Features.
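For instance, a sketch of an average-impact column for the 'air-quality' weather_category could look like the line below; the AVG aggregate and the _avg suffix are illustrative choices. Drop it in place of the corresponding MAX(...) line in the final SELECT of phq_impact_features:

---- Template (illustrative): swap MAX for another aggregate such as AVG
IFNULL(AVG(CASE WHEN i.weather_category = 'air-quality' THEN i.phq_rank ELSE NULL END), 0) AS phq_impact_severe_weather_air_quality_retail_avg,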
Step 3: Join all features into a single output table
The following code pulls all of the features generated above into a single table called ML_FEATURES_FOR_LOCATIONS.
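A minimal sketch of that join is below; it assumes the three temporary feature tables created above share the same date and location keys, which holds because all three are built from SAVED_LOCATIONS_DAILY:

---- Join all feature sets into a single output table
CREATE OR REPLACE TEMP TABLE ml_features_for_locations AS
SELECT
    att.*,
    rnk.* EXCLUDE (date, location),
    imp.* EXCLUDE (date, location)
FROM phq_attendance_features att
LEFT JOIN phq_rank_features rnk
    ON att.date = rnk.date
    AND att.location = rnk.location
LEFT JOIN phq_impact_features imp
    ON att.date = imp.date
    AND att.location = imp.location;

SELECT * FROM ml_features_for_locations ORDER BY location, date;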
This output is intended to be used directly by Machine Learning models. If unsure what features to use, it is recommended to create a Beam analysis for the locations and leverage the category importance results with the “View ML Features” option (see here).