Performance & Date Conversion Issues with Complex Custom SQL for Vehicle Data

Hi QuickSight Community,

I’m working on a custom SQL query in Oracle to build a dataset for vehicle tracking metrics. The goal is to adjust UTC timestamps to local time (using a parameter), calculate driving, stop, and idle durations per day, and ultimately return only the latest record per vehicle per day—all limited to a rolling 30-day window.

Here’s a high-level overview of what my query does:

  • Time Zone Conversion:
    I use a parameter (${SelectTimezone}) to adjust the raw timestamp from UTC into local time (EST, CST, MST, or PST).
  • Calculating Intervals:
    I use window functions (LAG) to calculate the time difference between consecutive records and classify each interval as “driving” (speed > 0), “stopped” (speed = 0), or “idle” (stopped but engine is “Running”).
  • Daily Aggregation:
    I aggregate these intervals by day (using TRUNC on the adjusted timestamp).
  • Latest Record per Day:
    I use ROW_NUMBER() to return only the latest record per vehicle per day.
  • Duration Formatting:
    I convert the numeric totals (in minutes) into a formatted string (e.g., “1d 03h:15m”).
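
As a side note, the overview mentions ROW_NUMBER() for the latest-record-per-day step, while the full query below uses a MAX(local_dt) join instead. For reference, a hedged sketch of the ROW_NUMBER() form (column and CTE names assumed from the query below, not verified):

```sql
-- Hypothetical ROW_NUMBER() variant of the latest-record-per-day step.
-- Assumes the raw_data CTE from the query below.
SELECT vin, local_day, latitude, longitude
FROM (
  SELECT
    r.vin,
    TRUNC(r.local_dt) AS local_day,
    r.latitude,
    r.longitude,
    ROW_NUMBER() OVER (
      PARTITION BY r.vin, TRUNC(r.local_dt)
      ORDER BY r.local_dt DESC   -- newest record first within each day
    ) AS rn
  FROM raw_data r
)
WHERE rn = 1
```

Either form works; the ROW_NUMBER() version avoids a self-join but can be heavier on large partitions.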

WITH raw_data AS (

-- Retrieve raw rows and convert each REPORT_TIME to local time based on
-- the parameter '${SelectTimezone}'.

SELECT
l.VIN AS vin,
l.REPORT_TIME AS utc_dt,
CASE
WHEN '${SelectTimezone}' = 'US/Eastern' THEN
FROM_TZ(CAST(l.REPORT_TIME AS TIMESTAMP), 'UTC') AT TIME ZONE 'US/Eastern'
WHEN '${SelectTimezone}' = 'US/Central' THEN
FROM_TZ(CAST(l.REPORT_TIME AS TIMESTAMP), 'UTC') AT TIME ZONE 'US/Central'
WHEN '${SelectTimezone}' = 'US/Mountain' THEN
FROM_TZ(CAST(l.REPORT_TIME AS TIMESTAMP), 'UTC') AT TIME ZONE 'US/Mountain'
WHEN '${SelectTimezone}' = 'US/Pacific' THEN
FROM_TZ(CAST(l.REPORT_TIME AS TIMESTAMP), 'UTC') AT TIME ZONE 'US/Pacific'
ELSE
FROM_TZ(CAST(l.REPORT_TIME AS TIMESTAMP), 'UTC')
END AS local_dt,

l.ODOMETER_READING       AS odometer,
l.VEHICLE_SPEED          AS speed,
l.LATITUDE               AS latitude,
l.LONGITUDE              AS longitude,
l.CITY                   AS city,
l.STATE                  AS state,
l.FUEL_LEVEL             AS fuel_level,
l.ENGINE_STATUS          AS engine_status,
l.IDLE_ENGINE_HOURS      AS idle_engine_hours,
l.SEATBELT_BUCKLED       AS seatbelt_buckled,
CASE 
  WHEN l.VEHICLE_SPEED > 0 THEN 'Driving'
  ELSE 'Stopped'
END AS row_status

FROM TDV.VEH_LOCATION_EMOB l
WHERE
l.FUEL_LEVEL > 0
AND l.REPORT_TIME >= CURRENT_DATE - INTERVAL '30' DAY
),
ordered_rows AS (

-- Order the data by the converted local_dt and capture the previous row values.

SELECT
r.vin,
r.local_dt,
r.odometer,
r.speed,
r.latitude,
r.longitude,
r.city,
r.state,
r.fuel_level,
r.engine_status,
r.idle_engine_hours,
r.seatbelt_buckled,
r.row_status,
LAG(r.local_dt) OVER (PARTITION BY r.vin ORDER BY r.local_dt) AS prev_local_dt,
LAG(r.odometer) OVER (PARTITION BY r.vin ORDER BY r.local_dt) AS prev_odometer,
LAG(r.row_status) OVER (PARTITION BY r.vin ORDER BY r.local_dt) AS prev_status,
LAG(r.idle_engine_hours) OVER (PARTITION BY r.vin ORDER BY r.local_dt) AS prev_idle_engine_hours,
LAG(r.engine_status) OVER (PARTITION BY r.vin ORDER BY r.local_dt) AS prev_engine_status
FROM raw_data r
),
rowwise_intervals AS (

-- Compute time intervals and distances on a row-by-row basis.

SELECT
o.vin,
TRUNC(o.prev_local_dt) AS local_day, -- groups by the converted local date
o.local_dt,
o.odometer,
o.speed,
o.row_status,
ROUND((CAST(o.local_dt AS DATE) - CAST(o.prev_local_dt AS DATE)) * 24 * 60) AS interval_min,
CASE WHEN o.prev_status = 'Driving' THEN
ROUND((CAST(o.local_dt AS DATE) - CAST(o.prev_local_dt AS DATE)) * 24 * 60)
ELSE 0 END AS driving_min,
CASE WHEN o.prev_status = 'Stopped' THEN
ROUND((CAST(o.local_dt AS DATE) - CAST(o.prev_local_dt AS DATE)) * 24 * 60)
ELSE 0 END AS stop_min,
CASE
WHEN o.prev_status = 'Stopped' AND o.prev_engine_status = 1 THEN
ROUND((CAST(o.local_dt AS DATE) - CAST(o.prev_local_dt AS DATE)) * 24 * 60)
ELSE 0
END AS idle_min_old,
CASE
WHEN o.prev_idle_engine_hours IS NOT NULL
AND o.idle_engine_hours >= o.prev_idle_engine_hours
THEN ROUND((o.idle_engine_hours - o.prev_idle_engine_hours) * 60)
ELSE 0
END AS idle_min_new,
CASE WHEN o.prev_status = 'Driving' THEN (o.odometer - o.prev_odometer)
ELSE 0
END AS interval_distance,
o.speed AS interval_speed
FROM ordered_rows o
WHERE o.prev_local_dt IS NOT NULL
),
daily_aggregates AS (

-- Aggregate intervals by day (using the converted local_day).

SELECT
r.vin,
r.local_day,
SUM(r.driving_min) AS total_driving_min,
SUM(r.stop_min) AS total_stop_min,
SUM(r.idle_min_old) AS total_idle_min_old,
SUM(r.idle_min_new) AS total_idle_min_new,
SUM(r.interval_distance) AS total_distance,
MAX(r.interval_speed) AS day_max_speed
FROM rowwise_intervals r
WHERE r.local_day IS NOT NULL
GROUP BY r.vin, r.local_day
),
daily_fuel AS (

-- Aggregate fuel-level information by day.

SELECT
d.vin,
TRUNC(d.local_dt) AS local_day,
MAX(d.fuel_level) KEEP (DENSE_RANK FIRST ORDER BY d.local_dt) AS start_of_day_soc,
MAX(d.fuel_level) KEEP (DENSE_RANK LAST ORDER BY d.local_dt) AS end_of_day_soc,
MIN(d.fuel_level) AS min_soc
FROM raw_data d
GROUP BY d.vin, TRUNC(d.local_dt)
),
joined_daily AS (

-- Join daily aggregates with daily fuel data.

SELECT
a.vin,
a.local_day,
a.total_driving_min,
a.total_stop_min,
a.total_idle_min_old,
a.total_idle_min_new,
a.total_distance,
a.day_max_speed,
f.start_of_day_soc,
f.end_of_day_soc,
f.min_soc
FROM daily_aggregates a
JOIN daily_fuel f
ON a.vin = f.vin AND a.local_day = f.local_day
),
LatestLocation AS (

-- Capture location info from the latest record of each day.

SELECT
vin,
TRUNC(local_dt) AS local_day,
MAX(local_dt) AS latest_local_dt
FROM raw_data
GROUP BY vin, TRUNC(local_dt)
),
LatestLocationDetails AS (

-- Retrieve detailed location info based on the latest record.

SELECT
r.vin,
TRUNC(r.local_dt) AS local_day,
r.latitude,
r.longitude,
r.city,
r.state,
r.local_dt
FROM raw_data r
JOIN LatestLocation l
ON r.vin = l.vin AND TRUNC(r.local_dt) = l.local_day AND r.local_dt = l.latest_local_dt
),
final_data AS (

-- Combine daily aggregates with location details.

SELECT
jd.*,
ld.latitude,
ld.longitude,
ld.city,
ld.state
FROM joined_daily jd
LEFT JOIN LatestLocationDetails ld
ON jd.vin = ld.vin AND jd.local_day = ld.local_day
)


-- Final SELECT: Return aggregated data using local day boundaries.

SELECT
final_data.vin,
final_data.local_day,
final_data.latitude,
final_data.longitude,
final_data.city,
final_data.state,
final_data.total_driving_min,
final_data.total_stop_min,
final_data.total_idle_min_old,
final_data.total_idle_min_new,
final_data.total_distance,
final_data.day_max_speed,
final_data.start_of_day_soc,
final_data.end_of_day_soc,
final_data.min_soc
FROM final_data
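
One gap worth noting: the final SELECT above returns the duration totals as raw minutes, so the "1d 03h:15m" formatting described in the overview is not actually shown. A hedged sketch of that step in Oracle, reusing the final_data CTE and its column names (alias driving_duration_fmt is illustrative):

```sql
-- Hypothetical: format total_driving_min (a minute count) as "1d 03h:15m".
-- FM00 zero-pads to two digits without a leading space.
SELECT
  vin,
  local_day,
  CASE
    WHEN total_driving_min >= 1440 THEN
      TO_CHAR(FLOOR(total_driving_min / 1440)) || 'd '
        || TO_CHAR(FLOOR(MOD(total_driving_min, 1440) / 60), 'FM00') || 'h:'
        || TO_CHAR(MOD(total_driving_min, 60), 'FM00') || 'm'
    ELSE
      TO_CHAR(FLOOR(total_driving_min / 60), 'FM00') || 'h:'
        || TO_CHAR(MOD(total_driving_min, 60), 'FM00') || 'm'
  END AS driving_duration_fmt
FROM final_data
```

Be aware that once a duration is a formatted string, QuickSight can no longer aggregate it, so it is usually best to return both the numeric minutes and the display string.
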

The Issue:
When I run this query in QuickSight as a custom SQL dataset, the preview takes an exceedingly long time to load. I have also encountered data type conversion issues: QuickSight sometimes interprets the adjusted timestamps as strings rather than dates, which complicates further calculated fields on the front end.

I have also tried replicating some of this logic (time zone conversion, date truncation, interval calculations) as QuickSight calculated fields, but then I run into date parsing and performance issues there instead.

My Questions:

  1. Has anyone encountered similar performance or date conversion issues with complex custom SQL in QuickSight?
  2. Are there recommended strategies for optimizing such queries when using window functions and multiple CTEs in QuickSight?
  3. Would it be better to push more of the logic to the QuickSight front end (using calculated fields and table calculations) rather than handling everything in custom SQL?
  4. Any advice on ensuring QuickSight properly recognizes date/datetime fields when converting from TIMESTAMP WITH TIME ZONE?

Any insights, optimization tips, or best practices you can share would be greatly appreciated!

Thank you in advance for your help.

Hi @olusegun83 -
Thank you so much for the details. Please find my answers below; we can get advice from other experts as well.

Has anyone encountered similar performance or date conversion issues with complex custom SQL in QuickSight?
[Sanjeeb:] When you use complex (or custom) SQL in QuickSight, the SQL essentially runs on the database server, so there is very limited scope for performance tuning inside QuickSight itself. However, you can create a final table (consolidating all the complex SQL) and use that table as the QuickSight source; this works well when the underlying data does not change frequently. Alternatively, you can enable SPICE mode, bring the final data into QuickSight SPICE, and use SPICE for reporting.

Are there recommended strategies for optimizing such queries when using window functions and multiple CTEs in QuickSight?
[Sanjeeb:] If you have multiple CTEs, I suggest pushing the logic down to the database layer and creating a final table or materialized view on top of it.
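
For reference, a minimal sketch of that materialized-view approach, assuming an Oracle source; the view name VEH_DAILY_METRICS_MV, the simplified aggregates, and the hourly refresh are illustrative, not from the original post:

```sql
-- Hypothetical: precompute the heavy daily aggregation in the database so
-- QuickSight only reads finished rows. Refresh cadence should match how
-- often the source data changes.
CREATE MATERIALIZED VIEW VEH_DAILY_METRICS_MV
  BUILD IMMEDIATE
  REFRESH COMPLETE
  START WITH SYSDATE NEXT SYSDATE + 1/24  -- refresh hourly
AS
SELECT
  l.VIN                AS vin,
  TRUNC(l.REPORT_TIME) AS utc_day,
  MAX(l.VEHICLE_SPEED) AS day_max_speed,
  MIN(l.FUEL_LEVEL)    AS min_soc
FROM TDV.VEH_LOCATION_EMOB l
WHERE l.REPORT_TIME >= SYSDATE - 30
GROUP BY l.VIN, TRUNC(l.REPORT_TIME);
```

The QuickSight custom SQL then reduces to a plain SELECT from the view. One caveat: a runtime parameter like ${SelectTimezone} cannot be baked into a materialized view, so the time-zone conversion would either stay in the QuickSight-side SQL or be materialized as one column per zone.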

Would it be better to push more of the logic to the QuickSight front end (using calculated fields and table calculations) rather than handling everything in custom SQL?
[Sanjeeb:] It depends. Calculations that are dynamic in nature are better placed at the reporting layer; complex calculations are better pushed to the ETL or processing layer.

Any advice on ensuring QuickSight properly recognizes date/datetime fields when converting from TIMESTAMP WITH TIME ZONE?
[Sanjeeb:] QuickSight should handle the date format. If you can share samples where you are unable to convert or standardize the date format, community members can help you.
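
One pattern that often helps here: the AT TIME ZONE expression yields a TIMESTAMP WITH TIME ZONE, which some drivers surface to QuickSight as a string. Casting the result back to a plain TIMESTAMP in the custom SQL usually restores the date typing. A sketch using the same FROM_TZ expression and table as the query above (illustrative, not verified against this dataset):

```sql
-- Hypothetical: strip the time-zone component so the driver reports a plain
-- TIMESTAMP instead of TIMESTAMP WITH TIME ZONE (often imported as string).
SELECT
  CAST(
    FROM_TZ(CAST(l.REPORT_TIME AS TIMESTAMP), 'UTC')
      AT TIME ZONE 'US/Eastern'
    AS TIMESTAMP
  ) AS local_dt
FROM TDV.VEH_LOCATION_EMOB l
```

If QuickSight still imports the column as a string, parseDate() on the QuickSight side is the usual fallback.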

Any insights, optimization tips, or best practices?
[Sanjeeb:] There are optimization tips available in the QuickSight community. Please do a search; if you are not able to find them, let us know.

Tagging other experts for their advice.
@WLS-DM @WLS-D @David_Wong @shravya @Biswajit_1993 @prantika_sinha

Regards - Sanjeeb

Hi @olusegun83,
It’s been a while since the last communication on this thread. Did you have any additional questions regarding your post?

If we do not hear back within the next 3 business days, I’ll close out this topic.

Thank you!

Hi @olusegun83,
Since we have not heard back, I’ll go ahead and close out this topic. However, if you have any additional questions, feel free to create a new post in the community and link this discussion for relevant information if needed.

Thank you!