Dynamic Code Blocks

Microsoft Dynamics GP & .NET technical Blog by Tim Wappat

A way to automatically export formatted Excel files and send to customers

Email Excel files to customers from Dynamics GP

More and more customers are requesting product feeds to be sent to them by email, using their own proprietary, highly formatted Excel documents. In addition, these feeds need to be sent daily, or whenever the data contained in them changes. The customer's expectation is that a person will produce these manually each day, week or month. As developers we should free people to be productive: people are expensive, and so is SQL Server, so let's work SQL Server harder and let people spend their time on the things people do better than machines!

Excel Gp Logo


It is possible to copy a template Excel file using SQL Server Integration Services (SSIS), populate it, and finally email it. Although possible, I’ve found it very fragile, and it is difficult to configure the package to create and maintain this process. The Excel drivers are not that good, and data type management can be a pain in the backside. As more of these templates are based on the modern .xlsx format it becomes even more tricky, as the drivers for that format are missing on our server. Making it work means installing them on the production server, in the correct 32/64-bit version. On a production server this becomes a task to run through test and into production, and it needs to be done each time a new SQL Server is provisioned.

Like many system challenges, there are dozens of ways to tackle this problem. Reporting Services can export to Excel, but it is pretty poor when control of formatting is required; it does some crazy stuff.

Automation of Excel is possible using COM, but it is not supported “unattended” on servers. Although I’ve used it a lot, it causes many reliability and memory issues; I really don’t like it much.

There are many other solutions out there, but I’m going to look at document generation from .NET.

Many, many years ago, when the XML-based Office document formats came out, I looked at the Open XML Format SDK 2.0 as a way to create and manipulate the new-format Office documents. At the time the SDK seemed hard work to use compared to the SSIS template technique. It was also heavy in dependencies and learning curve. Admittedly I don’t remember how much time I put into it, and the SDK and its examples may have improved since, but I recently became aware that times have moved on (thanks Steve for pointing that out) and there are now some open source solutions for manipulating Excel documents. After trying EPPlus, I have to say that I am impressed. The project has plenty of good example code, so getting started is super quick: an Excel document was exported and formatted from Dynamics GP in about ten minutes from first visiting the project website!

As a software guy, I find this so easy to use that suddenly manipulating Excel is no longer something to dread; excellent stuff! I then created a console application to generate the Excel files and run it from an “Execute Process Task” in SSIS (the task simply runs the console app). This means the pain of generating formatted Excel output files is gone. Next I may refine this console application to read a configuration file, so that documents can be generated by just changing the configuration rather than recompiling the code when small document format or data changes are required. That pattern would make it easier for DevOps to change simple things relating to the output file.
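To give a flavour of what the console application does, here is a minimal sketch of exporting a DataTable to a formatted .xlsx with EPPlus. EPPlus is a NuGet package, not part of the framework, and the class, sheet and method names in my real app differ; treat this as illustrative only.

```csharp
// Sketch: export a DataTable to a formatted .xlsx file with EPPlus.
// Assumes the EPPlus NuGet package is referenced; names are illustrative.
using System.Data;
using System.IO;
using OfficeOpenXml;   // EPPlus

public static class ExcelExporter
{
    public static void Export(DataTable data, string destinationPath)
    {
        using (var package = new ExcelPackage())
        {
            ExcelWorksheet sheet = package.Workbook.Worksheets.Add("ProductFeed");

            // Write the data with a header row in one call
            sheet.Cells["A1"].LoadFromDataTable(data, true);

            // Example formatting: bold header row, auto-fit columns
            sheet.Cells[1, 1, 1, data.Columns.Count].Style.Font.Bold = true;
            sheet.Cells[sheet.Dimension.Address].AutoFitColumns();

            package.SaveAs(new FileInfo(destinationPath));
        }
    }
}
```

This is the shape of thing the EPPlus sample code gets you to within minutes; all the customer-specific formatting then becomes ordinary cell/range styling calls.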

By executing the console app from SSIS it keeps like concerns together and visible to the IT team, and keeps this integration grouped with many other data integrations, in one place, with the goodness of SQL job scheduling and failure reporting to network engineers.

The SSIS package checks whether the Excel file was output; if it was, it is emailed to the customer using a “Send Mail Task”. It is great to keep the generation of the Excel file as a task in the SSIS package, allowing any other developer or IT staff to dream up fancy integration workflows around it and to change email addresses etc. in a familiar way, without involving me or configuration files. As a bonus, it is all in the nice GUI of the integration designer. Now let’s have a look at what I cooked up. I know other ways to solve these problems exist, and many more hybrid solutions too, but this works for us.

Configure SSIS for Exporting Excel From Dynamics GP

The overall package looks like this:

 SSIS Package Flow overview

Log in to SQL Server Business Intelligence Development Studio, selecting a server type of Integration Services.

Login Integration Services Form

Now create a new Integration Services Project and create a new package in that project.

Prepare by putting the .NET assembly in a directory accessible to SSIS on the SQL Server. This is the assembly that uses EPPlus to generate the Excel file output.

Directory with .NET assembly in it

Drop an Execute Process Task onto a new package control flow surface. Configure it to run the .NET application that generates the Excel file (the custom .NET assembly). Pass any arguments required to generate the file; here you could use variables from the SSIS package if you need to customise the output depending on the particular integration taking place.
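The console application's entry point just maps those task arguments onto the export. A minimal sketch (the argument order and names are assumptions, not my actual app):

```csharp
// Sketch of the console app entry point invoked by the Execute Process Task.
// Argument order (connection string, then destination path) is an assumption.
using System;

public static class Program
{
    public static int Main(string[] args)
    {
        if (args.Length < 2)
        {
            Console.Error.WriteLine("Usage: ExcelExport <connectionString> <destinationPath>");
            return 1;   // a non-zero exit code fails the Execute Process Task
        }

        string connectionString = args[0];
        string destinationPath = args[1];

        Console.WriteLine("Exporting to {0}", destinationPath);
        // ... query Dynamics GP and write the file with EPPlus here ...

        return 0;       // zero exit code = success in SSIS
    }
}
```

The exit code matters: the Execute Process Task's SuccessValue defaults to 0, so returning non-zero is what routes the package down the failure path.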

Execute Process Task

Put a Script Task in next; here we run some C# code to check whether the destination file, passed in through a read-only variable on the script task, exists, in order to decide whether we need to send an email with that file attached.

Script Task

Use the Edit Script button to write the following C#; it takes the package variable and returns a success or failure that can be used in the subsequent package flow control.

/*
   Microsoft SQL Server Integration Services Script Task
   Write scripts using Microsoft Visual C# 2008.
   The ScriptMain is the entry point class of the script.
*/

using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;

namespace ST_a0846610f3774bad84a9085271662c6f.csproj
{
    [System.AddIn.AddIn("ScriptMain", Version = "1.0", Publisher = "", Description = "")]
    public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
    {
        #region VSTA generated code
        enum ScriptResults
        {
            Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
            Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
        };
        #endregion

        /*
            The execution engine calls this method when the task executes.
            To access the object model, use the Dts property. Connections, variables, events,
            and logging features are available as members of the Dts property as shown in the following examples.

            To reference a variable, call Dts.Variables["MyCaseSensitiveVariableName"].Value;
            To post a log entry, call Dts.Log("This is my log text", 999, null);
            To fire an event, call Dts.Events.FireInformation(99, "test", "hit the help message", "", 0, true);

            To use the connections collection use something like the following:
            ConnectionManager cm = Dts.Connections.Add("OLEDB");
            cm.ConnectionString = "Data Source=localhost;Initial Catalog=AdventureWorks;Provider=SQLNCLI10;Integrated Security=SSPI;Auto Translate=False;";

            Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.

            To open Help, press F1.
        */

        public void Main()
        {
            // Succeed only if the file named by the DestinationPath variable exists
            if (System.IO.File.Exists((string)Dts.Variables["DestinationPath"].Value))
            {
                Dts.TaskResult = (int)ScriptResults.Success;
            }
            else
            {
                Dts.TaskResult = (int)ScriptResults.Failure;
            }
        }
    }
}


In my case there was a compile error when writing this script, around reading the variable; this is a documented issue where side-by-side versions of SQL Server have been installed. Move the offending assembly from the folder

C:\Program Files (x86)\Microsoft SQL Server\110\SDK\Assemblies

into a backup folder and the error goes away (use Google/Bing for details).


Next configure two Send Mail Tasks: one that includes the attachment, and another if you wish to get an email when the package executes but has nothing to send.

Send Mail Task Editor

It is necessary to set DelayValidation to True for the email steps, as this prevents the package checking for the existence of the file at the start of the run, when the output file may not exist yet.

DelayValidation Property, set to true


Select the package tab, or its node in Solution Explorer (SQL 2012+), then from the File menu choose “Save copy as…” to upload the package to the server package store with whatever encryption you wish. From there the package becomes available to SQL Server jobs for scheduling.

Create the SQL job to run the package and run it; you should receive an email.

Preventing the package from failing due to script task failing

When SQL jobs fail, it is standard practice to notify the ops team so they can investigate. Unfortunately, when no file exists we fail the script task for the purposes of flow control. Although the failure path is followed and an email is sent to notify that there was no file to send, there is a side effect: the failed Script Task propagates up to the package, where it registers as an error. As the maximum number of errors on the package is set to one, the overall package then reports failure. This in turn makes the SQL job report failure, yet job failure reporting is exactly the goodness of SQL jobs we want to harness with this technique, so false failures must be avoided.

Fix it by selecting the Event Handlers tab in the SSIS designer and using the drop-downs to create an OnError handler for the Script Task; there is no need to put anything on the design surface. Simply go to the variables pane and set Propagate OnError to False (you may need to widen the window a little to see all the columns). This stops the error from propagating up and failing the package.

Event handler for script task

Variables window: View > Other Windows > Variables, or for 2012 onward see below.

Propagate error variable

For SQL Data Manager 2012+

SQL 2010 SSIS Variable view for events


Also check that FailPackageOnFailure and FailParentOnFailure are set to False in the properties of the event handler.

OnError Event Handler Properties

With these aspects set up, the package should run and report success even if the Excel file has not been generated and the Script Task has therefore “failed”. In this implementation no Excel file is generated if the data has not changed since the last export (the data is cached in a table and compared by the SQL stored procedure that exports the data).
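The change-detection idea in that stored procedure can be sketched as below. The table and column names are illustrative assumptions, not the actual implementation, and a full diff would also need the reverse EXCEPT to catch deleted rows:

```sql
-- Sketch: only produce export rows when the current data differs from
-- the cached copy of the last export. Table/column names are illustrative.
IF EXISTS (
    SELECT ITEMNMBR, UNITPRCE FROM dbo.CurrentFeed
    EXCEPT
    SELECT ITEMNMBR, UNITPRCE FROM dbo.CachedFeed
)
BEGIN
    -- Data changed: refresh the cache, then return the rows for export
    TRUNCATE TABLE dbo.CachedFeed;
    INSERT INTO dbo.CachedFeed (ITEMNMBR, UNITPRCE)
        SELECT ITEMNMBR, UNITPRCE FROM dbo.CurrentFeed;

    SELECT ITEMNMBR, UNITPRCE FROM dbo.CurrentFeed;
END
-- If nothing changed, no resultset is returned, no file is written,
-- and the Script Task routes the package down the "no file" path.
```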

Schedule the SQL job and your customer will happily receive updates, while any errors will be reported to the operational teams.

Find invalid selling units of measure for Microsoft GP

-- Items whose selling unit of measure is not in the item's U of M schedule.
-- IV40202 (the U of M schedule detail table) is assumed to hold the valid
-- units for each schedule.
SELECT IV00101.ITEMNMBR, IV00101.SELNGUOM, IV00101.UOMSCHDL
FROM IV00101
LEFT JOIN IV40202 ON IV40202.UOMSCHDL = IV00101.UOMSCHDL
    AND IV40202.UOFM = IV00101.SELNGUOM
WHERE IV40202.UOFM IS NULL
AND IV00101.SELNGUOM != ''

The above will find any selling units of measure that are not contained in the item’s unit of measure schedule.

Validating data after econnect integration in Dynamics GP

After ingesting a customer purchase order into GP to create a sales order, it is a business requirement to check that the pricing provided by the customer matches that currently set up in GP pricing. This would be a tedious task for the sales user, drilling into each line of the order and checking the price against the current price in GP.

In this simple example, the customer submits in one currency and has a one-column (quantity break) price list in the default GP pricing module (no enhanced pricing used).

SQL Solution

eConnect provides a SQL event procedure, [dbo].[taSopHdrIvcInsertPost], that is fired after a document is pushed through eConnect.

Placing the following SQL into that procedure inserts a line into the order notes for every line of the order that has a price different from that currently defined in the customer price list. This allows the user to look in one place for this information.

This same idea could be applied to other validation purposes, checking against master data of any kind. It is always wise to sanity-check data when it arrives from untrusted sources (such as customers!).

It would be better if this were done within the code building the eConnect document; however, sometimes a company may not have control of that code, or they may need to put a temporary fix in while the feature request is processed by a software development company.

-- The original fragment is completed here into a full statement as a sketch.
-- The note table join (SY03900 via SOP10100.NOTEINDX) and the use of the
-- @I_vSOPNUMBE/@I_vSOPTYPE parameters of taSopHdrIvcInsertPost are
-- assumptions about the surrounding procedure; adjust to your environment.
UPDATE note
SET note.TXTFIELD = convert(VARCHAR(max), note.TXTFIELD)
    + isnull(convert(VARCHAR(max),
        (SELECT convert(VARCHAR(max), CHAR(10)
            + 'Price variation:'
            + convert(VARCHAR(40), orderline.LNITMSEQ / 16384)
            + CHAR(9)
            + convert(CHAR(12), orderline.UNITPRCE)
            + CHAR(9)
            + convert(CHAR(12), customerPrice.UOMPRICE)
            + CHAR(9)
            + rtrim(orderline.ITEMNMBR)
            + CHAR(9)
            + rtrim(orderline.ITEMDESC)
            + CHAR(9)
        ) AS [text()]
        FROM SOP10200 orderline
        LEFT JOIN IV00108 customerPrice ON orderline.ITEMNMBR = customerPrice.ITEMNMBR
            AND customerPrice.PRCLEVEL = '{customerPriceLevel}'
            AND customerPrice.CURNCYID = 'GBP'
        JOIN SOP10100 oh ON oh.SOPTYPE = orderline.SOPTYPE
            AND oh.SOPNUMBE = orderline.SOPNUMBE
        WHERE customerPrice.UOMPRICE != orderline.UNITPRCE
            AND oh.SOPNUMBE = @I_vSOPNUMBE
            AND oh.SOPTYPE = @I_vSOPTYPE
        FOR XML PATH(''))), '')
FROM SY03900 note
JOIN SOP10100 soph ON soph.NOTEINDX = note.NOTEINDX
WHERE soph.SOPNUMBE = @I_vSOPNUMBE
    AND soph.SOPTYPE = @I_vSOPTYPE

The script assumes that there is already an order note present (in our case this is true); just change it to an INSERT or MERGE statement if that is more appropriate to your import. It also assumes the currency is GBP. Add some formatting to the prices to tidy up the decimal places too, if you like; the above script should, however, give the foundations of something that can be adapted to your needs.

User connection to test or wrong company in Dynamics GP zDP_SY30000F_1

So you find that SQL monitoring or profiling shows a user accessing, every minute, a company that you don’t expect them to be accessing? How can this happen? This puzzled us for a little while today: we had a user accessing the test company, yet when we checked with the user, this was not the company they were currently logged into within GP.

Profiling showed the procedure being executed against the database is

exec DYNAMICS.dbo.zDP_SY30000F_1

The source of this is the user message functionality added in GP2013 R2; see this previous post, Check for User Messages(1) Dynamics GP Process Monitor. In that post I explain that GP polls a table every minute to see if there are any messages to show the user.

Below is a screenshot of SQL Profiler filtered for that database and user. The single user’s name fills the blurred columns. The odd thing is that this user is not even logged into the company that owns that database in GP.


Check for default database settings

My gut told me to head for the DSN settings; we checked, and indeed there lay the issue. It occurs because the DSN for the connection GP is using has a default database set to the database this activity is seen on.

However, checking the default database by going into the ODBC Data Source Administrator, found in Control Panel, is not enough. The database displayed there does not reflect the currently set default database.


Instead, to find the default, start a query from Microsoft Query in Excel, selecting the DSN used by GP, then click the more details button to see the database details. The default database should be master, but in the scenario outlined it will be the database where all the activity is seen. Change it to master and the issue should cease.
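The default database for a System DSN is also visible in the registry, so it can be inspected directly without the Microsoft Query round-trip. A sketch (the DSN name is a placeholder; GP's 32-bit DSNs sit under WOW6432Node on a 64-bit OS, and an absent Database value means the default is master):

```csharp
// Sketch: read a System DSN's default database straight from the registry.
// Checks both the native and WOW6432Node locations; returns null when no
// explicit Database value is set (i.e. the default is master).
using System;
using Microsoft.Win32;

public static class DsnInspector
{
    public static string GetDefaultDatabase(string dsnName)
    {
        string[] keyPaths =
        {
            @"SOFTWARE\ODBC\ODBC.INI\" + dsnName,
            @"SOFTWARE\WOW6432Node\ODBC\ODBC.INI\" + dsnName
        };

        foreach (string keyPath in keyPaths)
        {
            using (RegistryKey key = Registry.LocalMachine.OpenSubKey(keyPath))
            {
                object value = key == null ? null : key.GetValue("Database");
                if (value != null)
                    return (string)value;
            }
        }
        return null;
    }
}
```

This is handy for quickly auditing the DSNs across a fleet of GP workstations once you know this eccentricity exists.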


Will it cause harm?

Although normally this should cause no harm, as the query is actually targeted at the DYNAMICS database and is just passing through the selected company database, it could potentially be a problem: for example, if the selected database is a test company and that test company database gets restored during operational hours. It can also confuse IT administrators who don’t understand this eccentricity of GP.