Download Attachments from Exchange Web Services using SSIS

I recently worked on an engagement where the requirement was to process Excel spreadsheets sent to a group mailbox hosted on an on-premises Exchange server. The spreadsheets were based on a standard template and included enough validation to support data quality when loading into a mart. The interesting part of the engagement was building a custom task to process the emails and attachments sent to the group mailbox. The choices for this solution were either a third-party tool or a script task. The client didn’t want to purchase a third-party tool…so time to get my C# on.

You can download a sample package containing the script task here: http://ow.ly/MTEN3005InE

The script task is mostly a refactoring of the Exchange 101 code sample available here: https://code.msdn.microsoft.com/exchange/Exchange-2013-Get-d16d35bf

I altered the console app to use the default network credentials, so the task can run under a service account / proxy in SSIS. The script will also handle multiple emails with multiple attachments; the only things you need to provide are values for the following variables:

EmailAddr — the email account to connect to

FileExtension — the file extension of the attachments to download from emails

OutputDir — the directory that you want the attachments saved to

Schedule the task to run at your desired interval. It will query the mailbox to find all emails with attachments of the specified file type, download the attachments to the output directory, and then delete the email(s).
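For readers who want to see the shape of the logic before opening the package, here is a sketch of it in Python. The actual task is a C# script built on the EWS Managed API; the `Email` and `Attachment` classes and the function name below are illustrative only, not part of the download.

```python
import os
from dataclasses import dataclass, field

@dataclass
class Attachment:
    name: str
    content: bytes

@dataclass
class Email:
    subject: str
    attachments: list = field(default_factory=list)

def process_mailbox(emails, file_extension, output_dir):
    """Save attachments matching file_extension to output_dir.

    Returns the emails whose attachments were saved, i.e. the ones
    the real task would go on to delete from the mailbox.
    """
    processed = []
    for email in emails:
        saved = False
        for att in email.attachments:
            # Only download attachments with the configured extension
            if att.name.lower().endswith(file_extension.lower()):
                with open(os.path.join(output_dir, att.name), "wb") as f:
                    f.write(att.content)
                saved = True
        if saved:
            processed.append(email)  # delete only the emails we handled
    return processed
```

In the real script task, the delete happens via the EWS API after each message’s attachments are written to OutputDir.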

Note that you must also have the Microsoft Exchange Web Services Managed API 2.2 installed in your SSIS environment. Download available here: https://www.microsoft.com/en-us/download/details.aspx?id=42951

Hope this helps!

 

 


Bitwise XOR in DAX

In T-SQL and other languages we have bitwise XOR natively supported; the T-SQL operator is simply ^. Here’s the explanation of bitwise XOR for those not familiar:

https://en.wikipedia.org/wiki/Bitwise_operation

  1. Why would we want to implement this in DAX?
  2. How do we implement it?

1) I implemented XOR to ensure that only one table was being filtered from among a group of role-playing dimension tables. This adds some validation around returning data to users when they may be applying conflicting filter values.

2) The implementation is fairly straightforward: for any set of logical conditions, we can cast each true/false result to 1 or 0, sum the results, and return a value only when exactly one condition is true.
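A quick aside on terminology, sketched in Python for clarity: chained XOR over more than two inputs is really a parity test (true whenever an odd number of inputs are true), while the sum-equals-one pattern used here checks the stricter condition that exactly one input is true.

```python
from functools import reduce
from operator import xor

conditions = [True, False, False, False, False, False]

# Chained bitwise XOR is a parity check: true when an ODD number are true
parity = reduce(xor, conditions)

# The DAX pattern instead enforces "exactly one": cast to 1/0 and sum
exactly_one = sum(conditions) == 1

print(parity, exactly_one)  # True True

conditions = [True, True, True, False, False, False]
print(reduce(xor, conditions))  # True (three are true: odd count)
print(sum(conditions) == 1)     # False (more than one filter active)
```

For the one-and-only-one-filter requirement, the sum comparison is the behavior we actually want.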

Example:

I have a series of snapshot date filter tables for end-of-period values that all relate to a single snapshot fact table. The end date values in the filter tables are for week, month, and year for both calendar and fiscal periods, for a total of six role-playing tables (albeit not with exactly the same data, but with some intersecting values). I want to remove the chance of rendering any values when more than one snapshot date is being filtered against the facts by any of these tables.


Solution:

First, we create a few intermediate measures to implement our conditions. In this specific case, I want to make sure only one of the role-playing snapshot tables is filtered to a single date.

Measure #1 – Check that only one snapshot date table is filtered:

SnapshotXORFilter :=
IF (
    ISCROSSFILTERED ( 'Calendar Month Snapshot'[Calendar Month Snapshot Date] )
        + ISCROSSFILTERED ( 'Calendar Week Snapshot'[Calendar Week Snapshot Date] )
        + ISCROSSFILTERED ( 'Calendar Year Snapshot'[Calendar Year Snapshot Date] )
        + ISCROSSFILTERED ( 'Fiscal Month Snapshot'[Fiscal Month Snapshot Date] )
        + ISCROSSFILTERED ( 'Fiscal Year Snapshot'[Fiscal Year Snapshot Date] )
        + ISCROSSFILTERED ( 'Fiscal Week Snapshot'[Fiscal Week Snapshot Date] )
        = 1,
    TRUE (),
    FALSE ()
)

Measure #2 – Check that only one date value is filtered across all the tables:

SnapshotXORValue :=
IF (
    HASONEVALUE ( 'Calendar Month Snapshot'[Calendar Month Snapshot Date] )
        + HASONEVALUE ( 'Calendar Week Snapshot'[Calendar Week Snapshot Date] )
        + HASONEVALUE ( 'Calendar Year Snapshot'[Calendar Year Snapshot Date] )
        + HASONEVALUE ( 'Fiscal Month Snapshot'[Fiscal Month Snapshot Date] )
        + HASONEVALUE ( 'Fiscal Year Snapshot'[Fiscal Year Snapshot Date] )
        + HASONEVALUE ( 'Fiscal Week Snapshot'[Fiscal Week Snapshot Date] )
        = 1,
    TRUE (),
    FALSE ()
)

Measure #3 – Combine into a single expression for easier reuse:

SnapshotRender :=
IF (
    [SnapshotXORFilter] && [SnapshotXORValue],
    TRUE (),
    FALSE ()
)

Now, for all measures in my snapshot table, I can check against the SnapshotRender value before rendering like so:

On Hand Qty :=
IF ( [SnapshotRender], SUM ( [OnHandCount] ), BLANK () )

…or more generally

Semi Additive Measure :=
IF ( [SnapshotRender], Function ( [ColumnValue] ), BLANK () )

Migrating SQL Agent Jobs with Schedules using T-SQL

I’ve seen many a blog post about migrating SQL Agent jobs across instances using PowerShell, SMO, T-SQL, and the like. One thing I have not readily found is a way to create jobs while keeping their schedules in place. For those who may not be aware, the easiest way to accomplish this is by using the SSIS Transfer Jobs Task: https://technet.microsoft.com/en-us/library/ms137568(v=sql.120).aspx

I’m providing a T-SQL alternative here for three scenarios:

  1. SSIS is not enabled in your environment
  2. You don’t have rights to perform the action yourself, and need to provide a script
  3. You’d like to override certain job properties to a hard-coded value on the target instance (for example, setting the owner of all jobs to ‘sa’)

I developed the T-SQL based on the code blocks created when you script a job directly through SQL Server Management Studio (SSMS), which also follows the required steps as outlined here: https://technet.microsoft.com/en-us/library/ms181153(v=sql.105).aspx (which, for some reason, Microsoft only references through version 2008 R2). Run the script against the instance that you want to copy the jobs and schedules from.

The script will read the job definitions from the msdb system tables and output a script that performs the following when executed against a different instance:

  1. Creates the job categories if they do not already exist
  2. Creates the appropriate schedules if an existing job schedule with the same name does not exist
  3. Creates the job if an existing job with the same name does not exist
  4. Creates the job steps for all jobs matching job names from the originating instance where the step id does not already exist
  5. Attaches schedules to all jobs matching the configuration in the originating instance
  6. Sets the job start step based on the configuration in the originating instance
  7. Sets the job server to (local) for all jobs

The short takeaway is that the script is designed to be executed against a target instance that does not already have jobs or schedules with the same names as the source instance. If there are identical names, make sure the definitions are identical as well before running the script against the target. I’m also assuming that all jobs are local.

Things to note before running:

  1. Create the appropriate SQL Agent operator on the target instance. The script will only set the operator based on the default you provide (set as “DBAdmins” by default). If you need different operators for different jobs, then you’ll have to extend the script. Here’s how to create an operator: https://msdn.microsoft.com/en-us/library/ms175962.aspx
  2. All jobs and schedules will be created with ‘sa’ as owner. If that doesn’t work for you, then you’ll have to do some modifications to the script.

Ok, I’m starting to feel like this is turning into a pharmaceutical drug ad with all the disclaimers. Here’s the script already:

Download: http://ow.ly/hVxg3005IxH

Hope this helps!

/********************************************************
Script to create job definitions along with schedules
for migrating across instances.

Created by Jared Zagelbaum, jaredzagelbaum.wordpress.com
Created 5/21/2015

Follow me on Twitter: @JaredZagelbaum

Run the script against the instance that you want to copy the jobs and schedules from.

The script will read the job definitions from the msdb system tables and output a script that performs the following when executed against a different instance:

Creates the job categories if they do not already exist
Creates the appropriate schedules if an existing job schedule with the same name does not exist
Creates the job if an existing job with the same name does not exist
Creates the job steps for all jobs matching job names from the originating instance where the step id does not already exist
Attaches schedules to all jobs matching the configuration in the originating instance
Sets the job start step based on the configuration in the originating instance
Sets the job server to (local) for all jobs 

The short take away is that the script is designed to be executed against a target instance that does not have jobs or schedules with the same names as the source instance. 
If there are identical names, make sure that the definitions are identical as well prior to running the script against the target. I'm also assuming that all jobs are local.

Things to note before running:

Create the appropriate SQL Agent operator on the target instance. The script will only set the operator based on the default you provide (set as "DBAdmins" by default). 
If you need different operators for different jobs, then you'll have to extend the script. Here's how to create an operator: https://msdn.microsoft.com/en-us/library/ms175962.aspx

All jobs and schedules will be created with "sa" as owner. If that doesn't work for you, then you'll have to do some modifications to the script. 

*********************************************************/

USE [msdb]
GO

PRINT N'Use [msdb]'

/*******************************************
Create Job Categories
********************************************/

PRINT N'--Create Job Categories'
PRINT char(13) + char(10)

PRINT N'DECLARE @ReturnCode INT';
PRINT N'SELECT @ReturnCode = 0';

DECLARE @categoryname sysname
Declare categorycursor CURSOR FAST_FORWARD FOR
SELECT name FROM msdb.dbo.syscategories WHERE category_class = 1

OPEN categorycursor

PRINT N'BEGIN TRANSACTION';
PRINT char(13) + char(10);



FETCH NEXT FROM categorycursor
 INTO @categoryname

  WHILE @@FETCH_STATUS = 0
 BEGIN


PRINT N'IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name= ' + char(39) + @categoryname  + char(39) + N' AND category_class=1)';
PRINT N'BEGIN'
PRINT N'EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N''JOB'', @type=N''LOCAL'', @name=N''' + @categoryname + char(39);
PRINT N'IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback';
PRINT N'END'
PRINT char(13) + char(10);

FETCH NEXT FROM categorycursor
 INTO @categoryname

END
CLOSE categorycursor;
DEALLOCATE categorycursor;

PRINT char(13) + char(10);
PRINT N'COMMIT TRANSACTION'
PRINT N'GOTO EndSave'
PRINT N'QuitWithRollback:'
PRINT N'    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION'
PRINT N'EndSave:'
PRINT N'GO'

GO

/*******************************************
Create Schedules
********************************************/


PRINT N'--Create Schedules'
PRINT char(13) + char(10)

PRINT N'DECLARE @ReturnCode INT';
PRINT N'SELECT @ReturnCode = 0';

DECLARE @schedule_name sysname
DECLARE @enabled int
DECLARE @freq_type int
DECLARE @freq_interval int
DECLARE @freq_subday_type int
DECLARE @freq_subday_interval int
DECLARE @freq_recurrence_factor int
DECLARE @active_start_date int
DECLARE @active_end_date int
DECLARE @active_start_time int
DECLARE @active_end_time int
DECLARE @owner_login_name sysname = N'sa' --update owner if required

Declare schedulecursor CURSOR FAST_FORWARD FOR

select [name] as schedule_name
,[enabled]
,freq_type
,freq_interval
,freq_subday_type
,freq_subday_interval
,freq_recurrence_factor
,active_start_date
,active_end_date
,active_start_time
,active_end_time
from msdb.dbo.sysschedules

OPEN schedulecursor

PRINT N'BEGIN TRANSACTION';
PRINT char(13) + char(10);



 FETCH NEXT FROM schedulecursor
 INTO @schedule_name
,@enabled
,@freq_type
,@freq_interval
,@freq_subday_type
,@freq_subday_interval
,@freq_recurrence_factor
,@active_start_date
,@active_end_date
,@active_start_time
,@active_end_time


 WHILE @@FETCH_STATUS = 0
 BEGIN

PRINT N'IF NOT EXISTS (SELECT name FROM msdb.dbo.sysschedules WHERE name= ' + char(39) + @schedule_name  + char(39) + ')';
PRINT N'BEGIN'
PRINT N'EXEC @ReturnCode = sp_add_schedule';
PRINT N'@schedule_name = ' + '''' + cast(@schedule_name as nvarchar(max)) + '''' + ',';
PRINT N'@enabled = ' + cast(@enabled as nvarchar(max)) + ',';
PRINT N'@freq_type= ' + cast(@freq_type as nvarchar(max)) + ',';
PRINT N'@freq_interval= ' + cast(@freq_interval as nvarchar(max)) + ',';
PRINT N'@freq_subday_type= ' + cast(@freq_subday_type as nvarchar(max)) + ',';
PRINT N'@freq_subday_interval= ' + cast(@freq_subday_interval as nvarchar(max)) + ',';
PRINT N'@freq_recurrence_factor= ' + cast(@freq_recurrence_factor as nvarchar(max)) + ',';
PRINT N'@active_start_date= ' + cast(@active_start_date as nvarchar(max)) + ',';
PRINT N'@active_end_date= ' + cast(@active_end_date as nvarchar(max)) + ',';
PRINT N'@active_start_time= ' + cast(@active_start_time as nvarchar(max)) + ',';
PRINT N'@active_end_time= ' + cast(@active_end_time as nvarchar(max)) + ',';
PRINT N'@owner_login_name= ' + char(39) + cast(@owner_login_name as nvarchar(max)) + char(39);
PRINT char(13) + char(10);
PRINT N'IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback';
PRINT N'END'
PRINT char(13) + char(10);



FETCH NEXT FROM schedulecursor
 INTO @schedule_name
,@enabled
,@freq_type
,@freq_interval
,@freq_subday_type
,@freq_subday_interval
,@freq_recurrence_factor
,@active_start_date
,@active_end_date
,@active_start_time
,@active_end_time

END

PRINT char(13) + char(10);
PRINT N'COMMIT TRANSACTION'
PRINT N'GOTO EndSave'
PRINT N'QuitWithRollback:'
PRINT N'    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION'
PRINT N'EndSave:'
PRINT N'GO'


CLOSE schedulecursor;
DEALLOCATE schedulecursor;
GO

/*******************************************
Create Jobs
********************************************/

PRINT N'--Create Jobs'
PRINT char(13) + char(10)

PRINT N'DECLARE @ReturnCode INT';
PRINT N'SELECT @ReturnCode = 0';


DECLARE @job_name sysname
DECLARE @enabled int
DECLARE @notify_level_eventlog int
DECLARE @notify_level_email int
DECLARE @notify_level_netsend int
DECLARE @notify_level_page int
DECLARE @delete_level int
DECLARE @description nvarchar(512)
DECLARE @category_name sysname
DECLARE @notify_email_operator_name sysname =N'DBAdmins'	--set operator name here. Operator needs to already exist on target instance
DECLARE @owner_login_name sysname = N'sa'	--update owner if required


Declare jobcursor CURSOR FAST_FORWARD FOR

SELECT sj.[name] jobname
      ,[enabled]
      ,[notify_level_eventlog]
      ,[notify_level_email]
      ,[notify_level_netsend]
      ,[notify_level_page]
      ,[delete_level]
	  ,[description]
	  ,sc.[name] categoryname
  FROM [dbo].[sysjobs] sj
  INNER JOIN [dbo].[syscategories] sc
  ON sj.category_id = sc.category_id

  OPEN jobcursor

PRINT N'BEGIN TRANSACTION';
PRINT char(13) + char(10);


 FETCH NEXT FROM jobcursor
 INTO @job_name 
 ,@enabled 
 ,@notify_level_eventlog 
 ,@notify_level_email 
 ,@notify_level_netsend 
 ,@notify_level_page 
 ,@delete_level 
 ,@description
 ,@category_name 


 WHILE @@FETCH_STATUS = 0
 BEGIN

PRINT N'IF NOT EXISTS (SELECT name FROM msdb.dbo.sysjobs WHERE name= ' + char(39) + @job_name  + char(39) + ')';
PRINT N'BEGIN'
PRINT N'EXEC @ReturnCode = sp_add_job';
PRINT N'@job_name = ' + '''' + cast(@job_name as nvarchar(max)) + '''' + ',';
PRINT N'@enabled = ' + cast(@enabled as nvarchar(max)) + ',';
IF @notify_level_eventlog > 0	PRINT N'@notify_level_eventlog= ' + cast(@notify_level_eventlog as nvarchar(max)) + ',';

--Handle email notification unsupported in sp_add_job 
IF @notify_level_email > 0 AND @notify_email_operator_name IS NOT NULL	PRINT N'@notify_level_email= ' + cast(@notify_level_email as nvarchar(max)) + ',';
IF @notify_level_email = 0 AND @notify_email_operator_name IS NOT NULL	PRINT N'@notify_level_email= 1'  + ',';

IF @notify_level_netsend > 0	PRINT N'@notify_level_netsend= ' + cast(@notify_level_netsend as nvarchar(max)) + ',';
IF @notify_level_page > 0		PRINT N'@notify_level_page= ' + cast(@notify_level_page as nvarchar(max)) + ',';
IF @delete_level > 0			PRINT N'@delete_level= ' + cast(@delete_level as nvarchar(max)) + ',';
IF @description IS NOT NULL		PRINT N'@description= '+ char(39) + cast(@description as nvarchar(max))+ char(39) + ',';
IF @notify_email_operator_name IS NOT NULL PRINT N'@notify_email_operator_name= ' + char(39) + cast(@notify_email_operator_name as nvarchar(max)) + char(39) + ',';
PRINT N'@category_name= '+ char(39) + cast(@category_name as nvarchar(max))+ char(39) + ',';
PRINT N'@owner_login_name= ' + char(39) + cast(@owner_login_name as nvarchar(max)) + char(39);
PRINT char(13) + char(10);
PRINT N'IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback';
PRINT N'END'
PRINT char(13) + char(10);


 FETCH NEXT FROM jobcursor
 INTO @job_name 
 ,@enabled 
 ,@notify_level_eventlog 
 ,@notify_level_email 
 ,@notify_level_netsend 
 ,@notify_level_page 
 ,@delete_level 
 ,@description
 ,@category_name 
	
	END

PRINT char(13) + char(10);
PRINT N'COMMIT TRANSACTION'
PRINT N'GOTO EndSave'
PRINT N'QuitWithRollback:'
PRINT N'    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION'
PRINT N'EndSave:'
PRINT N'GO'


CLOSE jobcursor;
DEALLOCATE jobcursor;
GO

/*******************************************
Create Job Steps
********************************************/

PRINT N'--Create Job Steps'
PRINT char(13) + char(10)

PRINT N'DECLARE @ReturnCode INT';
PRINT N'SELECT @ReturnCode = 0';


DECLARE @job_name sysname
DECLARE @step_name sysname
DECLARE @step_id int
DECLARE	@cmdexec_success_code int
DECLARE	@on_success_action tinyint
DECLARE @on_success_step_id int
DECLARE	@on_fail_action tinyint
DECLARE @on_fail_step_id int
DECLARE	@retry_attempts int
DECLARE	@retry_interval int
DECLARE	@subsystem nvarchar(40)
DECLARE	@command nvarchar(3200)
DECLARE	@output_file_name nvarchar(200)
DECLARE @flags int
--DECLARE @server nvarchar(30)
--DECLARE @database_name sysname
--DECLARE @database_user_name sysname

Declare jobstepcursor CURSOR FAST_FORWARD FOR

SELECT sj.[name] jobname
      ,[step_name]
      ,[step_id]
	  ,[cmdexec_success_code]
	  ,[on_success_action]
      ,[on_success_step_id]
	  ,[on_fail_action]
      ,[on_fail_step_id]
	  ,[retry_attempts]
	  ,[retry_interval]
	  ,[subsystem]
	  ,[command]
	  ,[output_file_name]
      ,[flags]
      --,[server]
      --,[database_name]
      --,[database_user_name]
  FROM [dbo].[sysjobsteps] sjs
  INNER JOIN [dbo].[sysjobs] sj
  ON sjs.job_id = sj.job_id
  ORDER BY sjs.job_id, sjs.step_id

    OPEN jobstepcursor

PRINT N'BEGIN TRANSACTION';
PRINT char(13) + char(10);


 FETCH NEXT FROM jobstepcursor
 INTO @job_name, 
@step_name, 
@step_id, 
@cmdexec_success_code, 
@on_success_action, 
@on_success_step_id, 
@on_fail_action, 
@on_fail_step_id, 
@retry_attempts, 
@retry_interval, 
@subsystem, 
@command, 
@output_file_name, 
@flags --, 
--@server, 
--@database_name, 
--@database_user_name 


 WHILE @@FETCH_STATUS = 0
 BEGIN
PRINT N'IF NOT EXISTS (SELECT * FROM msdb.dbo.sysjobsteps sjs INNER JOIN msdb.dbo.sysjobs sj ON sjs.job_id = sj.job_id WHERE sj.[name] =' + char(39) + cast(@job_name as nvarchar(max)) + char(39) + ' and step_id = ' +  cast(@step_id as nvarchar(max)) + ')';
PRINT N'BEGIN'
PRINT N'EXEC @ReturnCode = sp_add_jobstep';
PRINT N'@job_name = ' +  char(39) + cast(@job_name as nvarchar(max)) +  char(39) + ',';
PRINT N'@step_name = ' + char(39) + cast(@step_name as nvarchar(max)) + char(39) + ',';
PRINT N'@step_id= ' + cast(@step_id as nvarchar(max)) + ',';
PRINT N'@cmdexec_success_code= ' + cast(@cmdexec_success_code as nvarchar(max)) + ',';
PRINT N'@on_success_action= ' + cast(@on_success_action as nvarchar(max)) + ',';
PRINT N'@on_success_step_id= ' + cast(@on_success_step_id as nvarchar(max)) + ',';
PRINT N'@on_fail_action= ' + cast(@on_fail_action as nvarchar(max)) + ',';
PRINT N'@on_fail_step_id= ' + cast(@on_fail_step_id as nvarchar(max)) + ',';
PRINT N'@retry_attempts= ' + cast(@retry_attempts as nvarchar(max)) + ',';
PRINT N'@retry_interval= '  + cast(@retry_interval as nvarchar(max))+ ',';
PRINT N'@subsystem= ' + char(39) + cast(@subsystem as nvarchar(max)) + char(39)+ ',';
PRINT N'@command= ' + char(39) + replace( cast(@command as nvarchar(max)), char(39), char(39) + char(39)) + char(39)+ ',';
PRINT N'@output_file_name= ' + char(39) + cast(@output_file_name as nvarchar(max)) + char(39)+ ',';
PRINT N'@flags= '  + cast(@flags as nvarchar(max)) --+ ',';
--PRINT N'@server= ' + char(39) + cast(@server as nvarchar(max)) + char(39)+ ',';
--PRINT N'@database_name= ' + char(39) + cast(@database_name as nvarchar(max)) + char(39)+ ',';
--PRINT N'@database_user_name= ' + char(39) + cast(@database_user_name as nvarchar(max)) + char(39);
PRINT char(13) + char(10);
PRINT N'IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback';
PRINT N'END'
PRINT char(13) + char(10);


FETCH NEXT FROM jobstepcursor
 INTO @job_name, 
@step_name, 
@step_id, 
@cmdexec_success_code, 
@on_success_action, 
@on_success_step_id, 
@on_fail_action, 
@on_fail_step_id, 
@retry_attempts, 
@retry_interval, 
@subsystem, 
@command, 
@output_file_name, 
@flags --, 
--@server, 
--@database_name, 
--@database_user_name 
	
	END

PRINT char(13) + char(10);
PRINT N'COMMIT TRANSACTION'
PRINT N'GOTO EndSave'
PRINT N'QuitWithRollback:'
PRINT N'    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION'
PRINT N'EndSave:'
PRINT N'GO'


CLOSE jobstepcursor;
DEALLOCATE jobstepcursor;

  GO
/*******************************************
Attach schedules
********************************************/

PRINT N'--Attach schedules'
PRINT char(13) + char(10)

PRINT N'DECLARE @ReturnCode INT';
PRINT N'SELECT @ReturnCode = 0';

DECLARE @schedule_name sysname
DECLARE @job_name sysname

Declare scheduleattachcursor CURSOR FAST_FORWARD FOR

select ss.[name] schedule_name
,sj.name job_name 
from msdb.dbo.sysschedules ss
inner join msdb.dbo.sysjobschedules sjs
ON ss.schedule_id = sjs.schedule_id
inner join msdb.dbo.sysjobs sj
ON sjs.job_id = sj.job_id

OPEN scheduleattachcursor

PRINT N'BEGIN TRANSACTION';
PRINT char(13) + char(10);

 FETCH NEXT FROM scheduleattachcursor
 INTO @schedule_name
 ,@job_name

 WHILE @@FETCH_STATUS = 0
 BEGIN

PRINT N'BEGIN'
PRINT N'EXEC @ReturnCode = sp_attach_schedule';
PRINT N'@job_name = ' + '''' + cast(@job_name as nvarchar(max)) + '''' + ',';
PRINT N'@schedule_name = ' + '''' + cast(@schedule_name as nvarchar(max)) + '''' ;
PRINT N'IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback';
PRINT N'END'
PRINT char(13) + char(10);

FETCH NEXT FROM scheduleattachcursor
 INTO @schedule_name
 ,@job_name

 END

 PRINT char(13) + char(10);
PRINT N'COMMIT TRANSACTION'
PRINT N'GOTO EndSave'
PRINT N'QuitWithRollback:'
PRINT N'    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION'
PRINT N'EndSave:'
PRINT N'GO'

CLOSE scheduleattachcursor;
DEALLOCATE scheduleattachcursor;
GO

/*********************************************************
Set Job Server and Start Step
*********************************************************/

PRINT N'--Set Job Server and Start Step'
PRINT char(13) + char(10)

PRINT N'DECLARE @ReturnCode INT';
PRINT N'SELECT @ReturnCode = 0';


DECLARE @job_name sysname
DECLARE @start_step_id int

Declare jobcursor CURSOR FAST_FORWARD FOR

SELECT [name] jobname
      ,start_step_id
  FROM [dbo].[sysjobs] 


  OPEN jobcursor

PRINT N'BEGIN TRANSACTION';
PRINT char(13) + char(10);


 FETCH NEXT FROM jobcursor
 INTO @job_name 
 ,@start_step_id 
 

 WHILE @@FETCH_STATUS = 0
 BEGIN

PRINT N'IF EXISTS (SELECT name FROM msdb.dbo.sysjobs WHERE name= ' + char(39) + @job_name  + char(39) + ')';
PRINT N'BEGIN'
PRINT N'EXEC @ReturnCode = sp_update_job';
PRINT N'@job_name = ' + '''' + cast(@job_name as nvarchar(max)) + '''' + ',';
PRINT N'@start_step_id = ' + cast(@start_step_id as nvarchar(max));
PRINT char(13) + char(10);
PRINT N'IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback';
PRINT N'END'
PRINT char(13) + char(10);

-- Add all jobs to local server (can be easily altered to support remote target servers if required)
PRINT N'IF NOT EXISTS (SELECT name FROM msdb.dbo.sysjobs sj INNER JOIN msdb.dbo.sysjobservers sjs ON sj.job_id = sjs.job_id  WHERE name= ' + char(39) + @job_name  + char(39) + ')';
PRINT N'BEGIN'
PRINT N'EXEC @ReturnCode = sp_add_jobserver';
PRINT N'@job_name = ' + '''' + cast(@job_name as nvarchar(max)) + '''' + ',';
PRINT N' @server_name = N' + char(39) + '(local)' + char(39);
PRINT char(13) + char(10);
PRINT N'IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback';
PRINT N'END'
PRINT char(13) + char(10);



 FETCH NEXT FROM jobcursor
 INTO @job_name 
 ,@start_step_id 
	
	END

PRINT char(13) + char(10);
PRINT N'COMMIT TRANSACTION'
PRINT N'GOTO EndSave'
PRINT N'QuitWithRollback:'
PRINT N'    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION'
PRINT N'EndSave:'
PRINT N'GO'


CLOSE jobcursor;
DEALLOCATE jobcursor;
GO
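One detail in the script worth calling out: before the job step @command text is embedded in the generated output, any single quotes inside it must be doubled, which is what the REPLACE(..., char(39), char(39) + char(39)) call in the job steps section handles. The same escaping rule, sketched in Python for illustration (the helper name is mine, not part of the script):

```python
def quote_tsql_literal(value: str) -> str:
    """Wrap a string as a T-SQL Unicode literal, doubling embedded single quotes."""
    return "N'" + value.replace("'", "''") + "'"

cmd = "EXEC dbo.SomeProc @p = 'abc'"
print("@command= " + quote_tsql_literal(cmd) + ",")
# @command= N'EXEC dbo.SomeProc @p = ''abc''',
```

Without this step, any job command containing a quoted string would produce a generated script that fails to parse.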

Synchronizing Read / Write SQL Agent Jobs Across AlwaysOn Replicas

Intro

AlwaysOn availability groups are a powerful enhancement to SQL Server HA, and offer many improvements over older database synchronization techniques like mirroring and log shipping. However, there is also additional complexity involved in implementing this technology, and one of the stickiest areas for database administrators is dealing with non-contained objects: objects that exist at the instance level, but not the database level (logins vs. users, for example).

For new development, contained vs. uncontained objects can be accounted for and considered during application design. But what about those existing mission critical legacy applications that we want to benefit from AlwaysOn? Those are, undoubtedly, the instances that would benefit from automated failover and read / write workload balancing the most! Well, fortunately these older implementations can be modified to work seamlessly with AlwaysOn, but the instances they run on will most likely need some intelligent customization. Here’s one situation you might encounter: legacy applications using SQL Agent jobs to update data on a schedule.

Overview

In SQL Server AlwaysOn, all system-level objects are non-synchronized, including SQL Agent. The assumption is that all applications need only database-level objects to function, and that system-level objects are only required for maintenance and administration of the instance. I had a client with a legacy, mission-critical application that they wanted to implement with AlwaysOn. This application relied on several SQL Agent jobs to update data on a scheduled basis in order to maintain application integrity. Less than optimal, I know, but that’s what you get with legacy. The blocker with this implementation is that the jobs would fail if executed on a secondary replica, because the primary is the only write-enabled replica.

So, how do we customize AlwaysOn replicas to ensure that these agent jobs are always and only running on the primary replica?

Solution

The solution I implemented is composed of four parts: a system function, a user stored procedure, a SQL Agent job, and a SQL Agent alert. The key to this implementation is ensuring that all of the jobs you need to manage are maintained in their own user-defined category (or categories).

System Function

Microsoft provides us with a handy-dandy little system function: sys.fn_hadr_is_primary_replica. This function accepts one parameter, @dbname sysname, the name of a synchronized database, and “returns 1 if the current instance is primary replica; otherwise returns 0.” This is what we’ll use in our procedure to determine whether to enable or disable the read/write SQL Agent jobs.

Stored Procedure

We can now create a stored procedure to enable / disable SQL Agent jobs in a user defined category based on, coincidentally, the same bit value that sys.fn_hadr_is_primary_replica returns:

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO


/* =============================================
-- Author:		Jared Zagelbaum, jaredzagelbaum.wordpress.com
-- Follow me on twitter: @JaredZagelbaum
-- Create date: 4/23/2015
-- Description:	Enables Read / Write jobs on primary replica after failover
-- Can be executed from failover alert and / or recurring schedule
-- =============================================
*/
CREATE PROCEDURE [dbo].[JobChange_AGFailover]  
(
@databasename sysname
,@jobcategory sysname
)
AS

BEGIN
	
  SET NOCOUNT ON;

  DECLARE @jobID UNIQUEIDENTIFIER  --variable for job_id 
  DECLARE @AG_enable tinyint
 
  select @AG_enable = sys.fn_hadr_is_primary_replica (@databasename)
   
  DECLARE jobidCursor CURSOR FOR   --used for cursor allocation 
 
  SELECT j.job_id 
  FROM msdb.dbo.sysjobs j
  INNER JOIN msdb.dbo.syscategories c
  ON j.category_id = c.category_id
  where c.name = @jobcategory

--update jobs
  OPEN jobidCursor
  FETCH NEXT FROM jobidCursor INTO @jobID
  WHILE @@Fetch_Status = 0     
  BEGIN
        EXEC msdb.dbo.sp_update_job @job_id = @jobID, @enabled = @AG_enable
        FETCH NEXT FROM jobidCursor INTO @jobID
  END

  CLOSE jobidCursor
  DEALLOCATE jobidCursor

END


GO
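The decision the procedure makes reduces to a single mapping: the 0/1 returned by sys.fn_hadr_is_primary_replica becomes the @enabled value for every job in the category. A minimal sketch in Python (the function name and job names below are mine, for illustration only):

```python
def on_failover(replica_is_primary: bool, job_names: list) -> dict:
    """Sketch of JobChange_AGFailover: every job in the category gets
    @enabled = 1 on the primary replica and @enabled = 0 on secondaries."""
    flag = 1 if replica_is_primary else 0
    return {name: flag for name in job_names}

print(on_failover(True, ["Load Facts", "Refresh Cache"]))
# {'Load Facts': 1, 'Refresh Cache': 1}
```

Because the flag is recomputed from the function's live result each time the procedure runs, it is safe to execute on every replica, on a schedule or from an alert.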


SQL Agent Job

Having fun yet!? Now we can add this stored proc to a job. Here’s a script that will create the job for you using the much-preferred CmdExec method (assuming you created [dbo].[JobChange_AGFailover] in master). Make sure you update the supplied values where noted.

USE [msdb]
GO

/****** Object:  Job [AG Failover Replica Changes]    Script Date: 5/5/2015 2:17:49 PM ******/
BEGIN TRANSACTION
DECLARE @ReturnCode INT
SELECT @ReturnCode = 0

IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'[Uncategorized (Local)]' AND category_class=1)
BEGIN
EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

END

DECLARE @jobId BINARY(16)
EXEC @ReturnCode =  msdb.dbo.sp_add_job @job_name=N'AG Failover Replica Changes', 
		@enabled=1, 
		@notify_level_eventlog=2, 
		@notify_level_email=2, 
		@notify_level_netsend=0, 
		@notify_level_page=0, 
		@delete_level=0, 
		@description=N'Enables / Disables read-write jobs against replicas when failover role change occurs', 
		@category_name=N'[Uncategorized (Local)]', 
		@owner_login_name=N'sa',						-- change if necessary per your environment
		@notify_email_operator_name=N'DBAdmins',		-- change to your default operator
		@job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
/****** Object:  Step [Exec JobChange_AGFailover]    Script Date: 5/5/2015 2:17:49 PM ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Exec JobChange_AGFailover', 
		@step_id=1, 
		@cmdexec_success_code=0, 
		@on_success_action=1, 
		@on_success_step_id=0, 
		@on_fail_action=2, 
		@on_fail_step_id=0, 
		@retry_attempts=0, 
		@retry_interval=0, 
		@os_run_priority=0, @subsystem=N'CmdExec', 
		@command=N'sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "EXECUTE [dbo].[JobChange_AGFailover] @Databasename = ''Databasename'', @jobcategory = ''jobcategory'' " -b', --supply your own databasename and jobcategory parameter values
		@output_file_name=N'$(ESCAPE_SQUOTE(SQLLOGDIR))\AGFailover_$(ESCAPE_SQUOTE(JOBID))_$(ESCAPE_SQUOTE(STEPID))_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM)).txt', 
		@flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
COMMIT TRANSACTION
GOTO EndSave
QuitWithRollback:
    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION
EndSave:

GO


SQL Agent Alert

Finally, let's create an alert to trigger this job whenever a failover occurs. The alert is raised on error number 1480 (availability group role change). Here's a script for it:

USE [msdb]
GO

EXEC msdb.dbo.sp_add_alert @name=N'AG Role Change', 
		@message_id=1480, 
		@severity=0, 
		@enabled=1, 
		@delay_between_responses=0, 
		@include_event_description_in=1, 
		@category_name=N'[Uncategorized]', 
		@job_name=N'AG Failover Replica Changes'
GO

Result

You should now have an instance of each of these objects on all of your replicas:

Stored Procedure:

sproc

SQL Agent Job:

Job

SQL Agent Alert:

Alert

Alright, we now have an automated way of enabling and disabling read / write SQL Agent jobs based on whether the replica role is primary or secondary. The alert is raised whenever a failover occurs. If, for some reason, the alert does not trigger successfully (I've not encountered this), you can always start the job manually on the affected replica as well.

The end result is that SQL Agent jobs that edit data are disabled on secondary replicas and enabled only on the primary, by job category. Multiple jobs and alerts can be added for different databases and/or categories. You can view the enabled property value easily under Job Activity Monitor in SQL Server Agent.

Hope this helps!

Automated Restore Script Output for Ola Hallengren’s Maintenance Solution

Intro

This blog post is designed to be useful to two different audiences:

  1. Those implementing Ola Hallengren’s Maintenance Solution for the first time
  2. Those that have been using it for a while, but might have a hole in their DR plan – what to restore when a resource goes offline

There are numerous blogs and presentations out there about Ola's Maintenance Solution, and I don't feel a strong need to replicate that information here. What I am presenting is a custom extension that works out of the box for those of you who have created jobs using the Maintenance Solution. The scripts add an additional job step to all DatabaseBackup jobs, which outputs a continuously updated text file containing the most current restore scripts for each database on the instance.

Solution

Let’s just get to it then, shall we?

You can download the latest version of my scripts here: https://github.com/jzagelbaum/OlaHallengrenRestoreScript

Please note that there are three requirements for the solution to function correctly:

  1. The DatabaseBackup jobs must use @LogToTable = 'Y'
  2. In order for the generated restore script to be valid, at least one full database backup should be logged to the CommandLog table
  3. SQL Server Agent should have read / write access to the directory where the restore scripts will be written. The default is the @Directory value.

If you don’t know what @LogToTable, CommandLog, and @Directory are, then please read up on Maintenance Solution before continuing.

Implementation

RestoreCommand Stored Procedure

Use the same database that the Maintenance Solution objects are created in; the default is master. First, create the RestoreCommand stored procedure. This is the procedure that will be executed each time the DatabaseBackup jobs run. It has no input parameters; it reads the records from the CommandLog table and outputs the restore commands for each database being backed up on the instance, including log, diff, and full, based on the last completed backup and backup type. You can create the procedure and execute it to view the output as a sample:

RestoreCommandOutput
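As a quick smoke test (assuming the objects were created in the default database of master), executing the procedure is simply:

```sql
USE [master];
GO
-- No parameters; the procedure reads backup history from the CommandLog table
EXEC dbo.RestoreCommand;
```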

Please, please, please do not just create this procedure without implementing the job script step. The whole point of this solution is to have a standby restore script available if your instance / database becomes unusable (and you're not clustering, etc.). I've really tried to make it as easy as possible to add the job step (assuming you're using the jobs created by the out-of-the-box Maintenance Solution). That being said…

Create Restore Script Job Steps

This script accepts two parameter values:

  1. @DatabaseName, the database where the Maintenance Solution and RestoreCommand objects are located (default of master)
  2. @RestoreScriptDir, which accepts 'Backup_Dir', 'Error_Log', or a custom directory, e.g., 'C:\'. If a custom value is used, the directory must be created before running the job.

For the @RestoreScriptDir parameter, the default value 'Backup_Dir' places the script in the instance-level folder under the backup file directory, which is determined by the value of the @Directory parameter. 'Error_Log' places the restore script in the same directory as the default location of the Maintenance Solution output files.

Make sure SQL Agent has read / write access to any custom directory if used.

After the script is run, you now have an additional job step in all of your DatabaseBackup jobs:

RestoreScriptStep RestoreScriptStepProp

RestoreScriptStepPropAdv

Result

Now, each time a DatabaseBackup job is run, a continuously updated text file containing the latest restore script is written to the appropriate instance folder of the configured directory. Here is a screenshot using the default backup directory:

RestoreCommandJobResult

RestoreCommandJobResultScript

Parting Thoughts

So, you now have the power to restore your instance at the database level up to the last verified, completed backup. No GUI, no guesswork, and most importantly, no trying to figure out what needs to be restored when you have the least amount of time to think about it.

Hope this helps!

Delete unused SQL Agent jobs from instance

I recently worked on a project where the client requested to "remove all of the unused jobs" that had been installed across their production instances over time. I originally thought that the enabled property would lead me to the right jobs, but since that property is user maintained (and users created the mess in the first place), it didn't end up being the best way to thin out the herd.

A couple more iterations led me to this solution, which looks for SQL Agent jobs that have never been run and are not currently scheduled, then executes sp_delete_job against all applicable job_ids. A more conservative approach would be to modify the script to use sp_update_job and simply flip the enabled bit. As I mentioned before, the enabled property wasn't being well maintained across the instances, and it was a short-term engagement, so I opted for the more aggressive approach:

--Script to delete sql server agent jobs that have never been executed or scheduled
--Authored by Jared Zagelbaum 4/2015    jaredzagelbaum.wordpress.com

DECLARE @jobID UNIQUEIDENTIFIER;   -- variable for job_id

DECLARE jobidCursor CURSOR FOR   -- used for cursor allocation
    -- jobs never run
    SELECT j.job_id
    FROM msdb.dbo.sysjobs j
    LEFT OUTER JOIN msdb.dbo.sysjobhistory jh
        ON j.job_id = jh.job_id
    WHERE jh.job_id IS NULL
      -- jobs not scheduled to run
      AND j.job_id NOT IN
        (
            SELECT job_id
            FROM msdb.dbo.sysjobschedules
        );

-- deletion
OPEN jobidCursor;
FETCH NEXT FROM jobidCursor INTO @jobID;

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC msdb.dbo.sp_delete_job @job_id = @jobID;
    FETCH NEXT FROM jobidCursor INTO @jobID;
END

CLOSE jobidCursor;
DEALLOCATE jobidCursor;
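If you'd rather take the conservative route mentioned above, the delete inside the loop can be swapped for a disable; it's a one-line change to the cursor body:

```sql
-- Conservative variant: flip the enabled bit instead of deleting
-- (replaces the sp_delete_job call inside the WHILE loop)
EXEC msdb.dbo.sp_update_job @job_id = @jobID, @enabled = 0;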

Consolidating time series metadata for users in Tabular

Background

DAX offers many powerful time intelligence functions that can be applied at virtually any level of a date / time hierarchy. There are also custom patterns available for week-based and ISO 8601 calendars that emulate the same functionality (case in point, see the Time Patterns articles on DaxPatterns.com, as well as other articles on SQLBI.com).

An issue that arises with the implementation of time intelligence in DAX, however, is that the default behavior is to create a new metadata entry for every measure created in the model. This behavior can quickly bloat your cube metadata into something confusing and error prone for users.

Overview

As an example, consider a common requirement: YTD, QTD, and MTD measures. Assuming a standard Gregorian calendar, these measures could be implemented in the AdventureWorks 2012 Tabular Model as follows:

Note: QTD and MTD would use the same calculation with the exception of replacing TOTALYTD in the DAX formulas with TOTALQTD and TOTALMTD, respectively.

YTD Internet Sales Amount :=
IF (
    MIN ( 'Date'[DateKey] )
        <= CALCULATE (
            MAX ( 'Internet Sales'[OrderDateKey] ),
            ALL ( 'Internet Sales' )
        ),
    TOTALYTD ( [Total Internet Sales Amount], 'Date'[Date] )
)
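Per the note above, the MTD version is the same pattern with the function swapped:

```
MTD Internet Sales Amount :=
IF (
    MIN ( 'Date'[DateKey] )
        <= CALCULATE (
            MAX ( 'Internet Sales'[OrderDateKey] ),
            ALL ( 'Internet Sales' )
        ),
    TOTALMTD ( [Total Internet Sales Amount], 'Date'[Date] )
)
```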

Extending the requirement to include comparisons against previous year, we could implement these measures as well:

PY YTD Internet Sales Amount :=
IF (
    MIN ( 'Date'[DateKey] )
        <= CALCULATE (
            MAX ( 'Internet Sales'[OrderDateKey] ),
            ALL ( 'Internet Sales' )
        ),
    TOTALYTD (
        [Total Internet Sales Amount],
        SAMEPERIODLASTYEAR ( 'Date'[Date] )
    )
)

If we save and browse our model with these definitions in place, we end up with this result:

Unconsolidated Excel Sample

Notice how MTD amounts incorrectly render at the quarter and year levels of the date hierarchy; likewise for QTD at the year level. These issues repeat for the PY measures.

The "quick and dirty" solution to this problem would be to hide any values for *TD measures where an inappropriate date attribute is used in the pivot / report. This would eliminate the risk of errors in reports (or, more likely, the risk of the developer losing all credibility). However, a major issue remains with this approach: every measure for which there is a *TD requirement would have metadata in triplicate. Users would have to remember to select the MTD, QTD, or YTD version of each measure from the model depending on which time intelligence function was desired for reporting. Likewise, a report with drilldown on the date hierarchy would basically end up dumped to a spreadmart, or become a developer-created / assisted task, as the ad hoc layout of such a report from the cube would be 3x larger than necessary.

Solution

Enter the Switch() function. By adding a check against which level of the date hierarchy is being filtered in the query, this function can consolidate and render the appropriate *TD measure while exposing only a single metadata entry in the model. We can now meet a common requirement to see cumulative amounts like so:

Note: the PY Internet Sales Amount To Date measure would reference the PY versions of the same measures

Internet Sales Amount To Date :=
SWITCH (
    TRUE (),
    ISFILTERED ( 'Date'[Day Of Month] ), [MTD Internet Sales Amount],
    ISFILTERED ( 'Date'[Month Name] ), [QTD Internet Sales Amount],
    ISFILTERED ( 'Date'[Calendar Quarter] ), [YTD Internet Sales Amount],
    ISFILTERED ( 'Date'[Calendar Year] ), [YTD Internet Sales Amount]
)

The intermediate *TD measures can now be hidden from client tools in the model and used only as intermediate calculations by the measures implementing the Switch() function. This allows us to consolidate metadata and business rules into a single measure per time series, like so:

Consolidated Metadata

Here’s how the data looks based on the Switch() rules used above (dates are filtered through to 5/25/2008 to show *TD functionality against PY):

Consolidated Excel Results

The Internet Sales Amount To Date and corresponding PY measures now show only the required cumulative totals at the appropriate level of the date hierarchy. Here, that is MTD at the day level, QTD at the month level, and YTD elsewhere. This can easily be changed to meet different requirements purely within the Switch() function.

Parting Thoughts

The nice part about this implementation is that it is naturally extensible thanks to DAX's built-in inheritance. Creating a single YOY cumulative calculation is simply:

YOY Internet Sales Amount To Date := [Internet Sales Amount To Date] - [PY Internet Sales Amount To Date]

…and so on.
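As a hypothetical further extension (not part of the original model), a YOY growth percentage could reuse the same measures, with DIVIDE guarding against blank denominators:

```
YOY % Internet Sales Amount To Date :=
DIVIDE (
    [YOY Internet Sales Amount To Date],
    [PY Internet Sales Amount To Date]
)
```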

The DAX implementation of time series metadata is still not as nicely abstracted as Multidimensional's, but there are a lot of tools you can use to simplify tabular models to meet most user requirements. Hope this helps!

SSRS SharePoint Integrated Mode – Report Server URL

This is a short, but hopefully sweet, post for someone who, like me, might end up spending a few hours trying to find the Report Server URL for SSRS 2012 (or newer) SharePoint integrated mode. I found it in Microsoft documentation, but I don't remember where, so here it is again:

http(s)://<top level site>/_vti_bin/reportserver
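For example, with a hypothetical top-level site at https://intranet.contoso.com, the Report Server URL would be:

```
https://intranet.contoso.com/_vti_bin/reportserver
```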

The key to this extremely cryptic puzzle is the "/_vti_bin" segment. Not easy to find if you are hunting online.

Update all SSRS reports to apply consistent header / footer on demand

Teaching good old SSRS a new trick…

Background

Not too long ago I had a client with a very simple, yet practical requirement: they wanted all of their SSRS reports to have standard headers and footers. Nothing groundbreaking there; however, the twist they put on this requirement was a very useful one: they wanted to use a single template file to apply the headers and footers. Even more, they wanted this template to be easily updatable, meaning any update to the template would apply the changes to all reports simultaneously. Ah, now we're on to something more interesting.

The original solution that I developed with a colleague to meet this requirement was based on local report definitions (.rdlc – or SSRS without the report server), but I thought it would be interesting, and hopefully helpful to others, to offer a solution that could be applied against the full version of SSRS (stand-alone and SharePoint integrated). So, here we go…

Overview

A few points of information about SSRS architecture before we get to specifics (for those who might not already be familiar). The .rdl files that define report definitions are generally authored in either Visual Studio or Report Builder. These files are actually XML documents that reference a published schema provided by Microsoft (for SSRS 2012 and 2014, the schema is Report Definition Language (RDL) 2010: http://schemas.microsoft.com/sqlserver/reporting/2010/01/reportdefinition/).

SSRS also offers a robust API… and even better for the less .NET savvy such as myself, SSRS ships with a command-line utility that will compile and run a "Visual Basic .NET code file, written against a proxy that is built on Web Service Description Language (WSDL), which defines the Reporting Services SOAP API". This is the rs utility (rs.exe): http://msdn.microsoft.com/en-us/library/ms157252(v=sql.110).aspx

In the past, I have only used the rs utility for scripting deployments and backups; but why not use it to update report definitions? It already exposes the necessary web service method: SetItemDefinition.

Solution

The implementation is actually fairly straightforward (after I racked my brain for a couple of days). As I said before, the original solution ran within a custom app that implemented LocalReport; we simply updated the appropriate .rdlc XML elements at run time against the defined template document. Transposing to the full report server, we just need to identify the template document on the server, then apply the appropriate XML elements to all of the report files in a loop instead.

A couple of things to be aware of regarding your template and report .rdl files:

1) Make sure all of the items in your Custom.rdl file are named creatively, as in something a report developer wouldn't normally use. When the script runs, it replaces items in each report's header and footer with the report items in the template. SSRS requires that all items within a report definition have unique names; otherwise the definition is not valid. So, if you have a textbox in your template named Textbox1 and you try to update all of your report files, there's an excellent chance you'll end up with the following error:

rs exec header error

I suggest naming each report item in the header like “CustomHeaderImageLeft20141114”.

2) Don't use an embedded image in your template file. The script just won't work if you do; but also, if you're applying the image to all of your report definitions, you're better off using an externally referenced image anyway. The image itself can simply be uploaded to your report server.

So, here’s the code for the script file (.rss):

'Script to apply custom header and footer from Custom.rdl file saved to ReportServer to all report files on ReportServer
' uses Mgmt2010 endpoint / executable against stand alone or SharePoint integrated instance
' Save file as .rss extension and run using rs.exe from command line.
'Authored by Jared Zagelbaum 11/2014    jaredzagelbaum.wordpress.com
Sub Main()
    Dim reportDefinition As Byte() = Nothing
    Dim doc As New System.Xml.XmlDocument
    Dim nsmanager As New XmlNamespaceManager(doc.NameTable)
    Dim templateHeader As System.Xml.XmlElement
    Dim templateFooter As System.Xml.XmlElement
    Dim reportHeader As System.Xml.XmlElement
    Dim reportFooter As System.Xml.XmlElement
    nsmanager.AddNamespace("rd", "http://schemas.microsoft.com/sqlserver/reporting/2010/01/reportdefinition")
    Dim items As CatalogItem() = rs.ListChildren("/", True)

    'find template file
    For Each item As CatalogItem In items
        If item.TypeName = "Report" And item.Name = "Custom" Then
            reportDefinition = rs.GetItemDefinition(item.Path)
            Dim stream As New MemoryStream(reportDefinition)
            doc.Load(stream)
            Exit For
        End If
    Next

    'load template header / footer into vars
    templateHeader = doc.SelectSingleNode("/rd:Report/rd:ReportSections/rd:ReportSection/rd:Page/rd:PageHeader", nsmanager)
    templateFooter = doc.SelectSingleNode("/rd:Report/rd:ReportSections/rd:ReportSection/rd:Page/rd:PageFooter", nsmanager)

    'iterate through catalog items and replace report headers / footers with template
    For Each item As CatalogItem In items
        If item.TypeName = "Report" And item.Name <> "Custom" Then
            reportDefinition = rs.GetItemDefinition(item.Path)
            Dim stream As New MemoryStream(reportDefinition)
            Dim outstream As New MemoryStream()
            doc.Load(stream)
            reportHeader = doc.SelectSingleNode("/rd:Report/rd:ReportSections/rd:ReportSection/rd:Page/rd:PageHeader", nsmanager)
            reportFooter = doc.SelectSingleNode("/rd:Report/rd:ReportSections/rd:ReportSection/rd:Page/rd:PageFooter", nsmanager)
            reportHeader.InnerXml = templateHeader.InnerXml
            reportFooter.InnerXml = templateFooter.InnerXml
            doc.Save(outstream)
            reportDefinition = outstream.ToArray()
            rs.SetItemDefinition(item.Path, reportDefinition, Nothing)
            stream.Dispose()
            outstream.Dispose()
        End If
    Next

End Sub

Save this script with an .rss extension and execute it with rs.exe from the command line on your report server (from any directory).
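The exact invocation depends on your environment, but a hypothetical run (the server and script file names are placeholders; -i, -s, and -e are rs.exe switches, with Mgmt2010 matching the endpoint targeted by the script) would look something like:

```
rem Native mode instance
rs.exe -i ApplyHeaderFooter.rss -s http://myserver/reportserver -e Mgmt2010

rem SharePoint integrated mode: point -s at the _vti_bin path
rs.exe -i ApplyHeaderFooter.rss -s http://sharepointsite/_vti_bin/reportserver -e Mgmt2010
```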

Parting Thoughts

The great part about coding against the Mgmt2010 endpoint is that the script runs against stand-alone or SharePoint integrated mode without change. However, if you have an older version of SSRS, you will have to alter the namespace reference and XPath to the appropriate version / schema. There may or may not be different API calls involved as well.

One could get even more detailed with this script and add common items anywhere in a report: datasets, etc. In those cases, it would also be better to create a reference to a shared dataset instead of scripting a new one into every single report.

I originally said that we were teaching SSRS a new trick, but the API / rs.exe utility has been around for several years now. Just like training a dog, most of the work is what the human has to learn. Hope this helped!