Pause on Error: Darn It, I Wish I’d Heard This Sooner

If you read my last blog, you know about my challenge with one of my current projects.  Love the client.  The database . . . not so much.


We’re in the process right now of moving the history into static fields in the archive file, which is a massively time-consuming job; just exporting each table takes forever.  Once that’s done, I can rewrite the reports against those static values: permanently captured, unchanging opening and closing month-end balances.


Sidenote:  if for any reason you’re building a solution that includes month-end values . . . please, for the love of GOD, don’t make them continually recalculate.  Save the values.  As it stands right now, if I want to look at a month-end balance for one of their customers from, say, December 2014, I guarantee you the numbers will not match the ones they printed back then.  This is bad.
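
To make that concrete, here’s roughly the kind of month-end script I mean, written out as FileMaker script steps.  The Customers and MonthEndBalances tables and their fields are placeholders, not anyone’s real schema.

    # Month-end snapshot, run once per month (say, from a server schedule)
    Go to Layout [ "Customers" ]
    Show All Records
    Go to Record/Request/Page [ First ]
    Loop
      # Grab the calculated balance while we’re on the customer record
      Set Variable [ $customerID ; Value: Customers::ID ]
      Set Variable [ $balance ; Value: Customers::CurrentBalance ]
      # Write it into a plain number field that will never recalculate
      Go to Layout [ "MonthEndBalances" ]
      New Record/Request
      Set Field [ MonthEndBalances::CustomerID ; $customerID ]
      Set Field [ MonthEndBalances::MonthEnding ; Get ( CurrentDate ) ]
      Set Field [ MonthEndBalances::ClosingBalance ; $balance ]
      Commit Records/Requests [ With dialog: Off ]
      Go to Layout [ "Customers" ]
      Go to Record/Request/Page [ Next ; Exit after last: On ]
    End Loop

A December 2014 report then reads from MonthEndBalances, and nothing that happens to the live data afterward can touch those numbers.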


The first session of Pause that I attended was led by Vince Menanno of Beezwax.  I mentioned the giant brains in attendance that week; my gosh, his alone would fill the hotel and the Heinen’s next door.  The topic was auditing: tracking data changes.


If you’re not familiar with the concept, here are a few links to browse:


http://www.seedcode.com/filemaker-audit-log/


http://www.nightwing.com.au/FileMaker/demos8/demo809.html



Audit Logging in FileMaker from DB Services

Some takeaways from the discussion:



Get ( ModifiedFields ) was a rabbit hole.  (I’ve been trying to use modification dates to capture records that have been altered, but just by exporting them, the dates update, so I’d be importing them every month forever.  The solution: the export script needs a step that stamps the current date into an ARCHIVED ON field, plus a trigger that clears that field On Record Commit.  Then I can just archive the records where that field is empty; there’s a sketch of this after the list.)
They explored Perform Script on Server: generating a long Excel export on the server and putting the file into a container field that the user can export.
Using popovers improves the user experience when combined with transactions and Perform Script on Server.
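
Here’s the sketch I promised for that first takeaway.  This is just how I’m picturing it; the Invoices table, the ArchivedOn field, and the file name are all placeholders, not the client’s real schema.

    # Monthly archive export: only grab records never archived, or changed since
    Go to Layout [ "Invoices" ]
    Enter Find Mode [ Pause: Off ]
    # "=" in Find mode matches records where the field is empty
    Set Field [ Invoices::ArchivedOn ; "=" ]
    Set Error Capture [ On ]
    Perform Find []
    If [ Get ( FoundCount ) > 0 ]
      Export Records [ With dialog: Off ; "archive.csv" ]
      # Stamp the found set so these records are skipped next month
      Replace Field Contents [ With dialog: Off ; Invoices::ArchivedOn ; Get ( CurrentDate ) ]
    End If

The companion piece is a one-step script attached to the layout’s OnRecordCommit trigger that clears ArchivedOn, so any record a user touches afterward falls back into the next month’s export.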

But this . . . this suggestion, which was, as I said, practically a side note, is my love.


Build a data entry layout with all your calculations, Vince said, but make them global fields.  Once the user commits, sweep up all that data, pop a new record into the real table, and insert the values as static numbers.


Isn’t that brilliant?  Isn’t that beautiful?  The user never accesses the actual data.  If the record isn’t completed, it’s never created at all.  The calculated fields don’t get to live in your solution, weighing it down with all their pluses and minuses and gets.
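
Here’s roughly how I picture the “save” script behind that kind of entry layout.  Everything on the layout is a global (the gSomething fields below), and the Entry and Orders names are stand-ins for whatever the real solution uses.

    # Runs when the user clicks Save on the global-field entry layout
    Set Variable [ $customerID ; Value: Entry::gCustomerID ]
    Set Variable [ $amount ; Value: Entry::gAmount ]
    # gTaxTotal can be an unstored calc on globals; it only does math while the user types
    Set Variable [ $tax ; Value: Entry::gTaxTotal ]
    # Create the one and only real record, with plain, static numbers
    Go to Layout [ "Orders" ]
    New Record/Request
    Set Field [ Orders::CustomerID ; $customerID ]
    Set Field [ Orders::Amount ; $amount ]
    Set Field [ Orders::TaxTotal ; $tax ]
    Commit Records/Requests [ With dialog: Off ]
    # Clear the globals so the layout is empty for the next entry
    Go to Layout [ "Entry" ]
    Set Field [ Entry::gCustomerID ; "" ]
    Set Field [ Entry::gAmount ; "" ]
    Set Field [ Entry::gTaxTotal ; "" ]

And if the user bails out instead of saving, nothing ever touches the Orders table; the globals just get cleared.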


I’ve been doing this for complex finds for a while — sending the user to a separate layout with global fields and grabbing those values as variables for a search.  But it never crossed my mind to build the major part of the interface like that.
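
For comparison, the find version of the same idea looks something like this (names made up again):

    # Search script fed by globals on a separate Search layout
    Set Variable [ $name ; Value: Search::gCustomerName ]
    Set Variable [ $from ; Value: Search::gDateFrom ]
    Set Variable [ $to ; Value: Search::gDateTo ]
    Go to Layout [ "Invoices" ]
    Enter Find Mode [ Pause: Off ]
    If [ not IsEmpty ( $name ) ]
      Set Field [ Invoices::CustomerName ; $name ]
    End If
    If [ not IsEmpty ( $from ) ]
      # Date-range criterion, e.g. 12/1/2014..12/31/2014
      Set Field [ Invoices::InvoiceDate ; $from & ".." & $to ]
    End If
    Set Error Capture [ On ]
    Perform Find []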


A single layout (far too complicated, IMO) is the primary user interface for my client’s big database.  So as soon as I get a chance, guess what I’m going to be doing?


 


 

