TotalAgility Reporting Service Processing Rate


Question / Problem: 

Is it possible to determine how much data has been processed by the reporting service in the past?

Answer / Solution: 

The ETL process transforms the raw reporting data in the reporting staging database into the data in the reporting database.  The following diagnostic query can be run against the TotalAgility_Reporting database when troubleshooting reporting issues to determine the rate of processing over time.  This can help determine whether a problem started or stopped on a particular date.

DECLARE @days INT;
DECLARE @dt DATETIME, @dtlast DATETIME;
SET @days = 7;

--find the end time of the most recent ETL run
SELECT @dtlast = MAX(etl_end_time) FROM etl_job;

--look at records for all time (slow) or the last @days days
--SELECT @dt = MIN(etl_end_time) FROM etl_job; --all time
SET @dt = DATEADD(DAY, -@days, @dtlast); --last @days days

CREATE TABLE #results (dt_execution_plus_minus_2hrs DATETIME, seconds_of_data_processed INT,
    dt_start_time DATETIME, dt_end_time DATETIME);

--step through the period in 4-hour increments, recording how much
--data was processed by ETL runs within +/- 2 hours of each point
WHILE (@dt < @dtlast)
BEGIN
    SET @dt = DATEADD(HOUR, 4, @dt);
    INSERT INTO #results
    SELECT @dt, DATEDIFF(SECOND, MIN(bssf.dt_end_datetime), MAX(bssf.dt_end_datetime)),
        MIN(bssf.dt_end_datetime), MAX(bssf.dt_end_datetime)
    FROM batch_sess_snapshot_fact bssf
    JOIN etl_job ej ON (bssf.etl_job_key = ej.etl_job_key)
    WHERE ABS(DATEDIFF(HOUR, ej.etl_end_time, @dt)) <= 2;
END

SELECT * FROM #results
--WHERE seconds_of_data_processed IS NOT NULL --ignore sections of time with no processing
DROP TABLE #results;

To use the query in a tenant database of an OPMT system, add either “live.” or “dev.” (depending on the environment) before each occurrence of the table names. The table names used here are etl_job and batch_sess_fact.
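For example, the variable assignment and the FROM/JOIN clauses would look like the following in a live tenant database (this is a sketch of the substitution described above; replace “live.” with “dev.” for a development tenant):

SELECT @dtlast = MAX(etl_end_time) FROM live.etl_job;
...
FROM live.batch_sess_snapshot_fact bssf
JOIN live.etl_job ej ON (bssf.etl_job_key = ej.etl_job_key)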
