What is document attach?
Dynamics GP 2013 R2 introduces document attach, allowing files (photos, PDFs, drawings, spreadsheets, emails, etc.) to be attached to, and associated with, objects in GP. An attachment icon appears on transactions, cards, window notes and many other places. An example might be attaching customer purchase orders and emails to sales orders.
In addition to providing a repository for documents related to GP windows, transactions and master data, the documents for some GP objects can be set to “flow”. This means the attached document will also attach itself to downstream objects created from that object. Put simply, attaching a contract terms and conditions PDF to a customer and setting it to flow will make that document automatically attach itself to quotes, orders and then invoices created for that same customer. The same applies to, say, a Word health and safety (COSHH) data sheet attached to an item: the data sheet will follow the item through the transactions that include it. Imagine how helpful that would be if exposed, say, in an order enquiry view on an eCommerce site.
What did it replace?
Something had to be done to replace the aged OLE notes system used by previous versions for basic attachments, for a number of reasons. The dawn of HTML web clients meant some application architectures had to change to enable delivery to remote web browsers. OLE notes were also often not well understood by system administrators, which frequently led to them being misconfigured, or to configurations being corrupted or broken. I have often seen OLE notes set back to default local storage on a PC, causing the documents on that workstation to ultimately be lost. I have also seen multiple document stores created, due to different points in the network UNC path being set as the root of the store by different configs, and I have even seen document stores misplaced on the network. OLE notes were also locked away, by the nature of the implementation, in a silo that was difficult to penetrate and thus to housekeep. Writing scripts to remove or audit attachments past their retention periods was a tricky challenge.
As OLE notes were a second-class feature, the importance and location of the store was not high on the radar of network admins, sometimes leading to inadequate backups and permissions being set on those folders. It also led to challenges in terminal server or Citrix deployments.
How does it work and what challenges does it bring?
Document attach now puts the documents into BLOB fields in the Dynamics GP database, which is great in that everything is encapsulated in one place. But when I first saw the tweets from Convergence announcing document attach as a feature, it sunk in what the consequences would be. The reason for my horror was that this was back before we had a SAN and Veeam for virtualisation backups. I knew we had over 50GB of data in our OLE notes, and 200GB in our own custom document attach functionality, and I assumed it would be sensible to migrate that data from our solution to the new native functionality. Our main GP company database was 100GB, hardly fitting on the tape backups as it was and taking longer than we’d like to ship to our DR site. So this would make our 100GB database grow to over 350GB (assuming extra metadata is stored in the database too), ouch!
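The back-of-envelope arithmetic can be written out as a minimal sketch using the figures above (the metadata overhead parameter is my own illustrative addition, not a measured figure):

```python
# Rough projection of company database growth if the existing file stores
# are migrated into document attach BLOB fields.

def projected_db_size_gb(current_db_gb, migrated_stores_gb, metadata_overhead=0.0):
    """Current database size plus migrated stores, with optional overhead factor."""
    return current_db_gb + sum(migrated_stores_gb) * (1 + metadata_overhead)

# 100GB company database, 50GB of OLE notes, 200GB of custom attachments.
projected = projected_db_size_gb(100, [50, 200])
print(f"Projected size: {projected:.0f}GB+")  # 350GB before any metadata overhead
```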
At that time, SQL storage was much more expensive than file storage, due to the higher spec of drives required to serve SQL Server compared to a file store. Servicing SQL backups was also more expensive in the skills and rigour required.
Impact on test environments
It also filled me with dread that, when we do our automated weekly restore of the live company to test, all this data would now be duplicated into the test company too, as it resides in the company database. The same was true when it was pushed into the developer clone of the live environment: the developer clone holds both the live and test databases, as it is deliberately a duplicate of live. Previously, OLE notes were not required in the test environments, as they were simply not used in any test scenarios we had encountered. Approaching a terabyte of extra space was going to be required across those environments.
Often during development we’d restore SQL backups of the database to test, say to test an integration, then restore again and again through testing. Tripling the size of the restore would make this take an impractical amount of time. Currently, if a table were deleted in SQL Server, restoring the company into another database and rolling it forward to the point just before the delete would take only about half an hour to recover the data; a new risk is introduced if that restore takes so much longer.
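Assuming restore-and-roll-forward time scales roughly linearly with database size (a simplification; real restores also depend on I/O throughput, log volume and hardware), the half-hour figure above extrapolates like this:

```python
# Extrapolate restore time from a known baseline, assuming linear scaling
# with database size. Baseline: ~30 minutes for the 100GB company database.

def restore_minutes(size_gb, baseline_gb=100, baseline_minutes=30):
    """Estimated restore-and-roll-forward time under a linear scaling assumption."""
    return size_gb / baseline_gb * baseline_minutes

print(restore_minutes(350))  # 105.0 -- about 1 hour 45 minutes instead of 30 minutes
```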
Steve Endow wrote a blog post about this, How will Dynamics GP Document Attach affect my database size? My conclusion is that if you store customer POs, drawings for custom artwork for products, quote requests, email trails, etc. against orders, like we do, then yes, your database size will rocket!
I looked at partitioning and other advanced SQL Server techniques, but mostly they require SQL Server Enterprise edition, and have you seen the cost of running Enterprise SQL Server?!
So, to mitigate the issues arising from the change, we will:
- Keep our custom file-store-backed attachment system; it allows drag and drop from Outlook and other locations, which users don’t want to lose.
- Let the SAN soak up the data sizes
- Let modern backup techniques deal with the backup sizes
- Use the table storage to our advantage; it allows better housekeeping of the attachments, helping us keep the overall size down.
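As an illustration of the housekeeping the last point refers to: with attachments in database tables rather than an opaque OLE file store, a retention sweep becomes a simple filter over attachment metadata. This is a hypothetical sketch only; the record shape and field names are illustrative, not the actual GP document attach schema.

```python
from datetime import date, timedelta

def attachments_past_retention(attachments, retention_days, today):
    """Return the IDs of attachments created before the retention cutoff date."""
    cutoff = today - timedelta(days=retention_days)
    return [a["id"] for a in attachments if a["created"] < cutoff]

# Illustrative metadata rows, as if read from the attachment tables.
rows = [
    {"id": "A1", "created": date(2007, 1, 15)},
    {"id": "A2", "created": date(2014, 3, 1)},
]

# A 7-year retention policy, evaluated on 1 June 2014: only A1 is due for removal.
print(attachments_past_retention(rows, 7 * 365, today=date(2014, 6, 1)))  # ['A1']
```

The IDs returned by a sweep like this could then drive deletion of the corresponding BLOB rows, something that was a tricky challenge against the old OLE notes silo.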
Mark Polino sums up the same issues I was originally feeling in his post Four reasons why Microsoft Dynamics GP 2013 R2’s new Doc Attach is here to stay:
> The file attach functionality in GP has been effectively broken for a long time. It's clunky to use, easy to setup incorrectly, and easy to break. The new database storage model is harder on database admins and will require better education for customers, but in the end I think it's a better option for the business as a whole.
I think what Mark says is mostly true, but if system admins now have responsibility for this, then give them more options around the backing store, and the flexibility to handle what can be a very big chunk of storage differently from the rest of the database, ideally in its own database as an option. Perhaps looking at SQL FILESTREAM would help too?