Tuesday, 17 January 2017

SQL Server Daily Monitoring Tool Part 2 - Application & Backup Job Status

Hi Friends,

Now that the basic setup from Part-1 is ready, we can build our daily monitoring packages to centralize health data from all SQL Servers.

Let's summarize what we achieved in Part-1:

  • Daily monitoring package structure 
  • Check list 
  • Metadata structure 
  • Dynamic connections in SSIS to connect to multiple servers using a single Connection Manager
  • Configured the first SSIS package to create the SQL Server inventory 
Download the package for Part-1 from here: Populate_Server_inventory

Linked Server: Some of the packages will need this.
The first and most important prerequisite for fetching data from other servers is to create a linked server on all other servers pointing to our local server, using the same SQL user ID. So make sure a linked server is created on all servers.
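As a rough illustration (the server name, login, and password below are placeholders, not values from this series), a linked server with an explicit SQL login mapping can be created with the standard system procedures:

```sql
-- Run on each remote server; CENTRAL01 and monitor_user are placeholder names
EXEC master.dbo.sp_addlinkedserver
     @server     = N'CENTRAL01',      -- the central/local monitoring server
     @srvproduct = N'SQL Server';

-- Map logins so the same SQL user ID is used on both sides
EXEC master.dbo.sp_addlinkedsrvlogin
     @rmtsrvname  = N'CENTRAL01',
     @useself     = N'False',
     @locallogin  = NULL,
     @rmtuser     = N'monitor_user',  -- placeholder SQL login
     @rmtpassword = N'StrongPassword!';
```

Run a quick `SELECT` through the linked server afterwards to confirm the connection works before building the packages.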

1. Application Jobs: Application jobs are those created by or requested by users to run in a scheduled window (via SQL Server Agent) - in simple terms, user jobs.
    So what do we need for application job monitoring?
  • Fetch job history from each server's msdb into our centralized table 
  • Once we have the history, we can filter columns in the SSRS report for visualization 

Metadata for the Application Job table: ApplicationJobStatusTableScript
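The downloadable script above is the authoritative definition; as a rough sketch only (the column list here is an assumption), the centralized table might look like this:

```sql
-- Hypothetical structure; use the ApplicationJobStatusTableScript download instead
CREATE TABLE dbo.Job_Status (
    [Server]          varchar(128) NOT NULL,                    -- source server name
    [Name]            sysname      NOT NULL,                    -- SQL Agent job name
    last_run_outcome  int          NULL,                        -- 0 = failed, 1 = succeeded
    last_run_date     int          NULL,                        -- YYYYMMDD, as stored in msdb
    Package_run_date  datetime     NOT NULL DEFAULT (GETDATE()) -- when the SSIS package loaded the row
);
```

An Archive_Job_Status table with the same columns holds the historical copies.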

Now that the structure is ready, let's create the package.

The package for application job status will look like this.

We need 2 Data Flow Tasks, 2 Execute SQL Tasks, 1 Foreach Loop Container, and 1 Sequence Container.

Step 1 : 

Design the SSIS package to get application job details 

  • Open BIDS or SQL Server Data Tools
  • Open the recently created project for the DM tool 
  • Now go to Solution Explorer --> Right Click --> New SSIS Package 
  • Rename this package to "Application & Backup Job Status.dtsx"  
    Step 2: 
     Create all the variables & connection managers as in Part-1.

     Step 3: 
This package has two parts: one for application job status and another for backup job status. To separate the two, we will use a Sequence Container.

Sequence Container in an SSIS package: 
It makes it simple to divide the control flow in a package into groups of tasks and containers that you can manage as a unit.

  • Go to the SSIS Toolbox on the right side. If it is not visible, you can enable it from the menu: click anywhere in the package --> click the SSIS menu --> select SSIS Toolbox

  • Drag and drop the Sequence Container into the package 


1. Archive Job Status Data Flow Task
  • Drag and drop a Data Flow Task --> This Data Flow Task archives the daily data from the current Job_Status table into the Archive_Job_Status table for historical purposes.
  • Double-click the Data Flow Task --> drag and drop OLE DB Source & OLE DB Destination components from the SSIS Toolbox

  • Click the Job_Status source 
  • Select your local server (the server where you created the job table)
  • Data access mode: Table or view 
  • Name of the table: Job_Status 


  • Go to Columns and check that all column check-boxes are ticked.
  • Now click the Archive_Job_Status destination 
  • Select your local server (the server where you created the archive job table)
  • Data access mode: Table or view 
  • Name of the table: Archive_Job_Status



  • We are done with the first Data Flow Task 
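For reference, this data flow is logically equivalent to a plain INSERT...SELECT between the two tables (the column list is an assumption based on the hypothetical table sketch earlier; match it to your actual script):

```sql
-- Same effect as the archive data flow: copy today's snapshot into the archive table
INSERT INTO dbo.Archive_Job_Status
        ([Server], [Name], last_run_outcome, last_run_date, Package_run_date)
SELECT   [Server], [Name], last_run_outcome, last_run_date, Package_run_date
FROM     dbo.Job_Status;
```

Using a Data Flow Task instead keeps the whole load visible and configurable inside the package.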
2. Truncate Job_Status Table Execute SQL Task 




  • Drag and drop an Execute SQL Task --> The purpose of this Execute SQL Task is to truncate all data from the main Job_Status table before loading any new data for the day.
  • Click --> Execute SQL Task
  • Select SQL Source Type: Direct input 
  • Click the ... box on SQLStatement 
  • Write a query to truncate the table 
  • Query: "TRUNCATE TABLE Job_Status"


3. Populate Server List of DMZ Servers - Execute SQL Task 



  • Drag and drop an Execute SQL Task --> The purpose of this Execute SQL Task is to select all server names from the ServerInfo_SSIS table for the connection manager's dynamic connection 
  • Click --> Execute SQL Task
  • Select SQL Source Type: Direct input 
  • Click the ... box on SQLStatement 
  • Write a query to select the servers from the table 

  • Query: 
    "SELECT LTRIM(RTRIM(Server)) AS servername
     FROM ServerInfo_SSIS
     WHERE (Connect = 1) AND (DMZ = 1) AND (Version >= 10)
     ORDER BY LTRIM(RTRIM(Server)) DESC"
    I have added a filter on version because some of the DMVs used for job history don't work on versions < 10 (i.e. before SQL Server 2008).




    • Then go to Result Set --> select the SQL_RS variable to store the server names, as shown in the screen clip















    • Click OK 
    • Now drag and drop a Foreach Loop Container, and then a Data Flow Task inside it, as shown below, to populate the job info table for SQL 2005, 2008, and 2012




     4. Foreach Loop Container

    • Click on the Foreach Loop Container 
    • Select "Foreach ADO Enumerator"
    • Go to Collection --> select the SQL_RS variable 

    • Go to Variable Mappings and select the "SRV_Conn" variable for mapping.

    • Now click the Data Flow Task you just added inside the Foreach Loop Container 
    • Drag & drop OLE DB Source & Destination components 


    Download the query file from here: JobStatusQuery
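    The downloadable JobStatusQuery file is the authoritative version; as a rough sketch of what such a query can look like (built on the real msdb catalog tables, but not copied from the download), the per-server source query might be:

```sql
-- Sketch only: last-run status of every enabled SQL Agent job on the current server
SELECT  @@SERVERNAME        AS [Server],
        j.name              AS [Name],
        js.last_run_outcome,             -- 0 = failed, 1 = succeeded
        js.last_run_date,                -- int, YYYYMMDD
        GETDATE()           AS Package_run_date
FROM    msdb.dbo.sysjobs       AS j
JOIN    msdb.dbo.sysjobservers AS js
        ON js.job_id = j.job_id
WHERE   j.enabled = 1;
```

    Because the Foreach Loop swaps the connection for each server, the same query runs everywhere and each row is tagged with its source server name.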



    • Check the mappings and click OK.

    Finally... the package is ready to fetch data for application job status :-). 

    Now, to filter this table for the SSRS report, we need the query below, which selects only the required column data from the table.


    Design SSRS Dashboard: 

    This part will come soon. For now, below is the query that filters the required columns from the table for display on the dashboard (you can alter the query to suit your requirements).



    Query: 

      SELECT  [SERVER], [NAME], last_run_outcome, last_run_date
      FROM    dbo.Jobs
      WHERE   last_run_outcome = 0
        AND   CONVERT(date, CONVERT(varchar(8), last_run_date)) >= CONVERT(date, GETDATE() - 1)
        AND   CONVERT(date, Package_run_date) >= CONVERT(date, GETDATE() - 1)


    In the SSRS dashboard it will look like this: 


    1. Without expanding 


    2. After expanding individual servers 



    We will discuss how to design the SSRS dashboard in a later part :-)

    Next we will develop the same package for:
    2. DBA Jobs: maintenance jobs, re-indexing jobs, DB alert jobs, etc.


    Coming Soon ...............