
FileMaker: One to Many Reports

FileMaker is great for putting together a quick form for any number of data-entry chores. Using a FileMaker portal, you can build a master/detail form that keeps track of transactions tied to some kind of header record. Typical examples include:

  • Invoices and line items
  • Customers and interactions
  • Prospects and sales efforts
  • Jobs and application sequence

These forms are especially helpful when you need to keep track of a “pipeline”. For example, when applying for a job, there are multiple steps involved:

  • Applied for the job
  • Received an eMail acknowledgement
  • First phone interview
  • Second phone interview
  • Scheduled live interview

At any one time you may have multiple jobs somewhere in the process, and a typical status report would show a list of each job and its current status.

This brings us back to the question of displaying the jobs and activities. We’re looking for a “report” which has a list of jobs and, underneath each job, the list of activities that have taken place for that job.

FileMaker portals display the related records for each master record, but a portal, almost by definition, shows only a fixed number of transactions at a time in a scrollable window. For a printed report, where we want to see all of the related records, we need to define the report without the portal. This is done using FileMaker’s “sub-summary” band when creating the report layout. The trick is to base the report on the transaction table, displaying its fields in the body band and referencing the master table in the sub-summary band. When setting this up, it looks something like this:

JobHunt2.png

Note the report bands on the extreme left, with the body band at the bottom, the sub-summary band above it, and the report header band at the top. One clue that the basis of the report is the transaction table is that the top of the screen shows “Table: Transactions”. Also, the references to related fields in the sub-summary band are prefaced with a double colon.

The ::RprtHeader field is actually a calculated field consisting of the employer’s name and the job being advertised by that employer. This solves the problem of applying to more than one job at a single employer, since it effectively provides a unique value for grouping the transactions for just one job at a time.

 

 

MailChimp: Data mining your subscriber lists.

MailChimp Logo

To find out more about your MailChimp lists, create a segment.

I’m not sure why it took me so long to figure this out (just dumb, I guess), but MailChimp actually has a pretty good built-in querying ability directly from the management interface. It involves the segmenting function, where you create subsets of your list. MailChimp calls these subsets segments, and the classic use for them is to break up a large list so that you can test different segments using different subject lines or mailing times.

From a database perspective, it looks like this:

MailChimp vs. Database
create a segment = create a query
segment = query results, aka a “cursor”
segmenting options = query criteria, aka  an SQL WHERE clause
saved segment = saved query results

In SQL, this would be the equivalent of:

SELECT * INTO <my segment> FROM <my eMail list> WHERE <my criteria>;

The available criteria are fixed, but there are a lot of useful ones. You can combine up to five criteria in a single segment request.  For example, let’s say you want to see how your list is performing. You can query how many subscribers opened:

  • all of your last five campaigns
  • one or more of the last campaigns
  • none of your last campaigns

The criteria are chosen from a convenient drop-down list.

Mailchimp Segment Drop-Down


To see the results of this query,  click on the  “Preview Segment” button at the bottom of the dialog box.

MailChimp - Segment Results


One thing you may note in the listing above is a field called “Grade Level”. We include this field on our MailChimp sign-up form. It will be populated only if we acquired the user through that form and if they chose to give us that information. We also ask for zip code.

The “Contact Rating” field, with the stars, rates the quality of the contact based on their campaign activity and the length of time that they have been on the list. Oddly enough, new acquisitions start out with two stars. If they fail to respond to several campaigns, then they are demoted to one star. These stars are the basis of determining how to pare down your list; eventually you might consider removing 1-star contacts altogether, or sending them a “re-engagement” eMail beforehand. This is well documented on the MailChimp web site. To cut to the chase…  4 and 5 star members are engaged, 3 star members either have low activity, or haven’t been on the list long enough to earn a higher rating.

 

HeidiSQL+pLink+putty=Joy

Logo for HeidiSQL, a slick GUI front-end for mySQL

After manually changing a hundred blog posts imported with another theme from “published” to “draft”, I figured it was time to actually look at my WordPress database, since we may wish to do some global link updates once we get all of the media imported from another blog. One of the best tools for this on Windows is the wonderful HeidiSQL program.

My Ubuntu server, which hosts mySQL, only accepts remote connections over SSH, so HeidiSQL has to tunnel its connection through SSH as well. This is done by using an intermediate program called plink (the command-line sibling of PuTTY, the terminal program for accessing the Linux command line), which sits between HeidiSQL and the server.

I found an explanation of how to use plink with HeidiSQL. However, if you can already reach the command line using PuTTY and an SSH connection on port 22, then you don’t have to do the first part of the instructions, because you already have the server’s host key cached on your machine. It was cool to be able to verify this in the Windows registry by looking at the registry key. And then, I was in.
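
For reference, the kind of tunnel that HeidiSQL drives through plink can also be set up by hand. A minimal sketch, assuming a made-up host and user name and a local forwarding port of 3307:

# Forward local port 3307 to mySQL (port 3306) on the server over SSH; -N means no remote shell
plink.exe -ssh larry@myubuntuserver.example.com -L 3307:127.0.0.1:3306 -N
# HeidiSQL (or any mySQL client) can then connect to 127.0.0.1:3307 as if it were the remote server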

heidisqlscreen

 

 

FileMaker 15

filemaker

FileMaker has updated to version 15 for all platforms. This version includes a ton of bug fixes, heightened security, and some internal changes, as opposed to visual changes. In fact, it looks and acts almost the same as version 14.

There is a fair amount of grousing going on in the FileMaker forums about the paucity of new features and (yet another) increase in the effective price. I’m afraid that FileMaker is pricing itself completely out of the low-end market, although a single copy can be had for education or nonprofits for $197 on Amazon (regular price $329). This is a perfectly respectable deal, and you might want to consider at least one copy for end-user database needs, especially if you use Macs or loathe Microsoft Access. FileMaker is my database of choice for building front-ends for mySQL (via ODBC), managing eMail lists, and creating tables and inspecting data of all kinds.

FileMaker comes with some pre-built database applications called “Starter Solutions”. Some of these have been updated for version 15. I did my expense reporting for a recent trip in the Expense solution, and it makes an attractive listing of all your trip expenses, sorted by category. Here’s the data entry screen.

Screenshot_051716_035903_PM

Here’s the report:

Expense Report

The application can be hosted on FileMaker server and accessed in a web browser, or run within a standalone copy of FileMaker on a Mac or PC, or run on an iPhone or iPad using the free FileMaker Go app.

There is provision for storing an image of each receipt.  If you run the application on an iPad, you can snap a picture of a paper receipt, or enter a bar-code.  Pretty slick!

 

Brightpearl API: Add UPS Tracking Numbers

We have now been using our web store for about a month, and for the most part things have been going pretty smoothly. One issue has been sending orders to our warehouse, and we’ve got a pretty good Powershell script that creates a comma delimited text file (.csv) of order numbers and address information from queries to the Brightpearl API. This file is sent daily to the warehouse via FTP, and warehouse staff  import the orders into their UPS Worldship program.

Screenshot_042715_044611_PM

The second half of this saga is to obtain the UPS tracking number for each shipment. Once the shipment has been processed in UPS Worldship, a tracking number is generated and stored in the UPS Worldship record for that shipment. Worldship has an export function which will add the tracking number to a .csv file of order numbers and tracking numbers that we can use to update the order record in Brightpearl. The structure of this file (which is completely customizable) is:

  • Order Number – in our case, the Brightpearl sales order number
  • Tracking Number – from UPS; these look like “1Z 041 388 03 8331 4101”
  • Expected Delivery Date

The .csv file looks like this (UPS loves long field names):

ShipToCustomerID,ShipmentInformationLeadTrackingNumber,ShipmentInformationDeliveryDateTransitTime
"100064","1Z0413880373533722","20150429"
"100020","1Z0413880373302132","20150504"
"100068","1Z0413880373810940","20150430"
"100074","1Z0413880374436157","20150430"

The next step is to walk through the .csv file, find an order number, and update the custom field PCF_TRACKING in Brightpearl to contain the tracking number. Here is the Powershell call to update a single record:

PS>$BPOrders=Invoke-RestMethod `
 -Uri http://ws-use.brightpearl.com/public-api/myBPAccount/order-service/order/100541/custom-field `
 -Body $body `
 -Headers $headers `
 -Method Patch

There are a couple of points of interest here. For the most part it is “standard” Powershell syntax for Invoke-RestMethod.
1. We invoke this by assigning the results of the API call to $BPOrders.
2. The call spans several lines; the line continuation character is the back-tick (a grave accent).
3. Note that this query uses a $headers variable which includes the two validation properties for the Brightpearl query: the name of the application and the security token for the application. These are stored as a hashtable.

PS>$headers
brightpearl-app-ref myappreference
brightpearl-staff-token mystaff-tokenXYZ123
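
For reference, building that hashtable looks something like this (the application reference and staff token shown are placeholders):

PS>$headers = @{"brightpearl-app-ref"="myappreference";"brightpearl-staff-token"="mystaff-tokenXYZ123"}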

More on obtaining the authcode here.

4. The $body variable is also created as a hashtable, but then converted to JSON and placed between square brackets. This variable contains three parameters: the operation that you are performing on the record, the field that you want to modify, and the value that you want to put in the field. The syntax below simply says, “Replace the contents of the /PCF_TRACKING field with the value 12345”.

$body=[ordered]@{"op"="replace";"path"="/PCF_TRACKING";"value"="12345"}
$body=($body | ConvertTo-JSON)
$body=("["+$body+"]")

The result is:

PS>$body
[{
"op": "replace",
"path": "/PCF_TRACKING",
"value": "12345"
}]

5. The -Method parameter is “Patch”. This allows you to replace the contents of a single field in a record, rather than replacing the entire record as happens when you use PUT.

6. Finally, note that in the Invoke-RestMethod call the URI contains “custom-field”. This is a literal; it isn’t the name of your custom field. The name of the custom field is contained in the body. In the example above, it is “/PCF_TRACKING”.

The above API call will replace the contents of a single field in a single record. The next step is to be able to loop through the .csv file, and for each record, find the corresponding record within the Brightpearl database, and update its Tracking number field.
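
Here is a minimal sketch of that loop, assuming the UPS export has been saved locally (the file name below is made up) and that $headers still holds the Brightpearl credentials:

# Sketch: read the UPS export and PATCH each order's PCF_TRACKING custom field
$shipments = Import-Csv "C:\Users\Larry\Powershell\tracking.csv"
foreach ($row in $shipments) {
 # ShipToCustomerID carries the Brightpearl sales order number
 $body = [ordered]@{"op"="replace";"path"="/PCF_TRACKING";"value"=$row.ShipmentInformationLeadTrackingNumber}
 $body = "[" + ($body | ConvertTo-Json) + "]"
 Invoke-RestMethod `
  -Uri "http://ws-use.brightpearl.com/public-api/myBPAccount/order-service/order/$($row.ShipToCustomerID)/custom-field" `
  -Body $body `
  -Headers $headers `
  -Method Patch
}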

Oh, one more thing: the results of the operation are contained in $BPOrders. The API actually returns the contents of ALL custom fields; you can choose which ones you want to see using dot notation.


PS>$BPOrders.response.PCF_TRACKING
12345

Powershell: Basic Address List Processing

One of the things that was always a little bit fun (well, for some of us… what passes for fun) is processing lists using Unix/Linux shell scripts and tools. When you are in practice, you can perform miracles and leap tall buildings in a single bound. Let’s see what we can do with a file using Powershell commands.

I received a file of “lapsed” donors. These are donors to our organization that gave to one of our campaigns in the past, but haven’t given recently. We know they were our friends in the past, and we think they still are, so we’d like to contact them for a one-time mailing, and/or add them to our master mailing lists.

The file is named “lapsed.csv”. The .csv extension suggests this is a text file with comma-separated values. And indeed, if I look at this file at the Powershell prompt, it is easily visible.

PS C:\Users\Larry\powershell> cat lapsed.csv

This shows a comma delimited file with a header line:

Addressee,first,spouse,Organization,Street,City,State,ZIP
Joe Blow,Joe,,Paul C Bunn Elementary,123 W 57TH St Apt 123,New York,NY,10019
Jill Smith,Jill,,Howard Services,123 Poor Farm Rd,Colchester,VT,05446

Our standard is:

Org, fname, lname, address1, address2, city, state, zip

So, among other things, we’re going to want to change the order of the information in the fields, as well as the field names.

First thing to do is to import the file into a single PowerShell variable, and then see what we’ve got.

PS>$lapsed= Import-CSV lapsed.csv
PS>$lapsed

Addressee : Joe Blow
 first : Joe
 spouse :
 Organization : Paul C Bunn Elementary
 Street : 123 W 57TH St Apt 123
 City : New York
 State : NY
 ZIP : 10019
Addressee : Jill Smith
 first : Jill
 spouse :
 Organization : Howard Services
 Street : 123 Poor Farm Rd
 City : Colchester
 State : VT
 ZIP : 05446

Using Import-CSV, the file is converted into a series of custom objects with members that correspond to the existing field names, so further manipulations can be done with object operations instead of just a bunch of searching and replacing. (Well, that’s the theory anyway.)

We can find out the number of records we have by looking at the count attribute.
 PS>$lapsed.count
 553

And we can see the “members” or field names of each record by using Get-Member

PS>$Lapsed |Get-Member

TypeName: System.Management.Automation.PSCustomObject
Name MemberType Definition
 ---- ---------- ----------
 Equals Method bool Equals(System.Object obj)
 GetHashCode Method int GetHashCode()
 GetType Method type GetType()
 ToString Method string ToString()
 Addressee NoteProperty System.String Addressee=Joe Blow
 City NoteProperty System.String City=New York
 first NoteProperty System.String first=Joe
 Organization NoteProperty System.String Organization=Paul C Bunn Elementary
 spouse NoteProperty System.String spouse=
 State NoteProperty System.String State=NY
 Street NoteProperty System.String Street=123 W 57TH St Apt 123
 ZIP NoteProperty System.String ZIP=10019

Ok…we knew the field names before by just looking at the raw .CSV file. But now we have seen how the Import-CSV command converts the .CSV file to an array of objects with a type of PSCustomObject.  Since each address record is an object, the way we manipulate it is to use object methods.
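
For example, an individual record and its members can be addressed directly with an index and dot notation:

PS>$lapsed[0].Addressee
Joe Blow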

1. Add an lname field for the last name
PS>$lapsed | Add-Member -Name "lname" -MemberType NoteProperty -Value ""

That takes the $lapsed table and pipes it to the Add-Member cmdlet. This adds the member to EACH object in the table, rather than adding it to the table itself. (NoteProperty is the member type to use for a plain data field, and we give it an empty string as its starting value.)

2. Add fname and org fields in the same way, for the first name and the organization
PS>$lapsed | Add-Member -Name "fname" -MemberType NoteProperty -Value ""

PS>$lapsed | Add-Member -Name "org" -MemberType NoteProperty -Value ""

Now the fields look like this:
Addressee : Joe Blow
 first : Joe
 spouse :
 Organization : Paul C Bunn Elementary
 Street : 123 W 57TH St Apt 123
 City : New York
 State : NY
 ZIP : 10019
 lname :
 fname :
 org :

3. Copy data from the first name and organization fields to their new fields.
PS>$lapsed | ForEach-Object {$_.fname=$_.first}
PS>$lapsed | ForEach-Object {$_.org=$_.Organization}
This leaves us with a record looking like this.

Addressee : Joe Blow
 first : Joe
 spouse :
 Organization : Paul C Bunn Elementary
 Street : 123 W 57TH St Apt 123
 City : New York
 State : NY
 ZIP : 10019
 lname :
 fname : Joe
 org : Paul C Bunn Elementary

4. Having copied the data from the old fields to the new ones, we can delete the old fields.
There isn’t a cmdlet to remove an object member, so you have to use a different nomenclature. (Note To Self… opportunity to make a custom cmdlet?)

PS>$lapsed | ForEach-Object {$_.PsObject.Members.Remove('Organization')}
PS>$lapsed | ForEach-Object {$_.PsObject.Members.Remove('first')}
PS>$lapsed | ForEach-Object {$_.PsObject.Members.Remove('spouse')}
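
For what it’s worth, that note-to-self might turn into something like the rough sketch below: a pipeline-friendly helper (the name Remove-ObjectMember is made up):

# Hypothetical helper: remove a named member from every object piped into it.
# The objects are modified in place, so nothing needs to be reassigned.
function Remove-ObjectMember {
    param(
        [Parameter(Mandatory, ValueFromPipeline)] $InputObject,
        [Parameter(Mandatory)] [string] $Name
    )
    process {
        $InputObject.PSObject.Members.Remove($Name)
    }
}

# Usage, equivalent to the three commands above:
# $lapsed | Remove-ObjectMember -Name 'Organization'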

Now a typical record is starting to look much more like what we want it to look like.

 Addressee : Joe Blow
 Street : 123 W 57TH St Apt 123
 City : New York
 State : NY
 ZIP : 10019
 lname :
 fname : Joe
 org : Paul C Bunn Elementary

5. We still need to pick out the last name from the Addressee field.
There might be a couple of approaches to this using ordinary text-manipulation methods:
a. Given a string “Joe Blow”, we could find the first blank character, and then take anything to the right of it as our last name.
b. We could start at the right-hand side and count backwards until we get to a space.
c. If there are word functions, we could choose the right-most word in the string.
d. Use the split function. This is what we’ll use; a quick test follows below.
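
Split() with no arguments breaks the string on whitespace, and the [-1] index picks the last element of the resulting array:

PS> "Joe Blow".Split()
Joe
Blow
PS> "Joe Blow".Split()[-1]
Blow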

PS> $lapsed | ForEach-Object {$_.lname=$_.Addressee.split()[-1]}

6. Finally, we can eliminate the “Addressee” field.
PS> $lapsed | ForEach-Object {$_.PsObject.Members.Remove('Addressee')}

Street : 123 W 57TH St Apt 123
 City : New York
 State : NY
 ZIP : 10019
 lname : Blow
 fname : Joe
 org : Paul C Bunn Elementary

7. Time to export back to a CSV file.
PS> $lapsed | Export-Csv -Confirm -Path "C:\Users\Larry\Powershell\ulapsed.csv" -NoTypeInformation

8. Almost done. The one wrinkle is that the field order isn’t exactly as I’d like. This can be fixed with the Select-Object cmdlet.

PS> $lapsed | Select-Object -Property fname,lname,org,Street,City,State,ZIP |
Export-CSV -Path "C:\Users\Larry\Powershell\ulapsed.csv" -NoTypeInformation

Recall that when typing a string of commands in a pipeline, a pipe character at the end of a line also acts as a line continuation, so you can break the command up over a couple of lines and have the full pipeline execute as one command.

Notes:

As they say on public television: “Many thanks to the following:”

“jrv” on Microsoft Technet
http://goo.gl/1Kyw07

“Root Loop” on StackOverflow
http://stackoverflow.com/questions/22029944/batch-or-powershell-how-to-get-the-last-word-from-string

More Info:
http://windowsitpro.com/powershell/csv-excel-or-sql-it-doesnt-matter-powershell

More about the split function in Powershell help
PS> Get-Help about_Split

BrightPearl API Part V – PowerShell

PowerShell is a Microsoft download, currently at version 4.0. It is a batch command language and enhanced command shell, and is the current successor to CMD.exe in Windows. It is a bit of a mash-up between the old DOS command line and the Unix-style shells like BASH, with connections to .NET objects added. Unfortunately, it also abstracts a number of parameters into objects, so you can’t always just put stuff on a command line; you have to assign values to an object parameter when passing parameters, and dig them out of a returned object when reading results.
 
To continue with our Brightpearl example:
Recall that when working with the Brightpearl API, you have to first obtain a temporary authorization code, which is good for roughly 30 minutes. You do this by making an HTTP POST which includes your credentials in the body of the post.
 
Obtain a Brightpearl Authorization Code:
 
This is a two step process:
 
1. First, stuff the authorization credentials into a variable. Note that the credentials are formatted in nested JSON notation, and the inner double quotes have to be escaped with back-ticks:
$bpauth = "{apiAccountCredentials:{emailAddress:`"myemail@mycompany.com`", password:`"Mypassword`"}}"
2. Execute the command-line call, using the stored log-in credentials in the Body parameter:
Invoke-WebRequest -Uri http://ws-use.brightpearl.com/mybpaccountname/authorise -Body $bpauth -ContentType application/json -Method Post
The Brightpearl server returns an authorization code, which is displayed as an object with a series of properties. The authorization code is contained in a JSON string in the “Content” property. This is the authorization code that must accompany any subsequent call to the Brightpearl API.
StatusCode        : 200
StatusDescription : OK
Content           : {“response”:”53145c429-x1xx-y2sf-z34a-8abc9cde96f9gh”}
RawContent        : HTTP/1.1 200 OK
                    Pragma: no-cache
                    ruid: a745efb3-2414-428f-8427-5001e3c810b8
                    Connection: keep-alive
                    Content-Length: 51
                    Cache-Control: no-cache, must-revalidate
                    Content-Type: application/json;char…
Forms             : {}
Headers           : {[Pragma, no-cache], [ruid, a745efb3-2414-428f-8427-5001e3c810b8], [Connection, keep-alive], [Content-Length, 51]…}
Images            : {}
InputFields       : {}
Links             : {}
ParsedHtml        : mshtml.HTMLDocumentClass
RawContentLength  : 51
To isolate the code itself, you have to assign the output of the above command to a variable, and then access it using dot notation.
So, restate the above command to assign the output to a variable:
$bpcode = Invoke-WebRequest -Uri http://ws-use.brightpearl.com/microdesign/authorise -Body $bpauth -ContentType application/json -Method Post
If you then execute $bpcode.Content, you’ll get the authorization code (again in JSON format).
PS C:\Users\Larry\PowerShell>$bpcode.Content
This returns:
  {“response”:”53145c429-x1xx-y2sf-z34a-8abc9cde96f9gh”}
To see this in a more readable format:
PS C:\Users\Larry\PowerShell>$bpcode.Content | ConvertFrom-Json
response
——–
53145c429-x1xx-y2sf-z34a-8abc9cde96f9gh
To assign the authorization code itself to a variable, we have to strip it out of either the raw JSON string or the converted version. It is probably easier to work with the raw JSON string, because we can simply trim off everything up to the colon, and then strip off the double quotes and the last curly bracket.
$bpstring=$bpcode.content
The following command strips off everything up to the start of the actual code.
 
$bpstring = $bpstring.TrimStart("{`"response`"`:")
It gives us:   
“53145c429-x1xx-y2sf-z34a-8abc9cde96f9gh”}
The following command strips off the final quote and curly bracket.
$bpstring = $bpstring.TrimEnd("`"`}")
This gives us our final result; what we’re really looking for:
53145c429-x1xx-y2sf-z34a-8abc9cde96f9gh
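
As an aside, since ConvertFrom-Json already made an appearance above, the same result can be had without any string trimming; a one-line alternative:

$bpAuthCode = ($bpcode.Content | ConvertFrom-Json).response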

Final Script:

The final script is pretty close to the interactive commands entered at the PowerShell command line; note the back-tick escape characters used for the nested double quotes and for line continuation. If you use the PowerShell ISE as an editor, it will color-code things nicely and flag scripting errors.
# BrightPearl API: Get an authorization code for subsequent API queries
# Note escape character is the “`” (back tick), instead of the usual backslash.
# Double quotes need to be escaped when nested inside.
# LK 10/9/2014
 
 
# Assign credentials to an authorization object.
$bpauth = "{apiAccountCredentials:{emailAddress:`"myeMailAddress@mycompany.com`",password:`"myPassword`"}}"
 
# Execute the HTTP POST to retrieve the authorization code. The result is assigned to the string $bpstring
# Note use of the back tick as a line continuation character
$bpstring = Invoke-WebRequest `
-Uri http://ws-use.brightpearl.com/mybpaccountname/authorise `
-Body $bpauth `
-ContentType application/json `
-Method Post
 
$bpAuthCode = $bpstring.Content
$bpAuthCode = $bpAuthCode.TrimStart("{`"response`"`:")
$bpAuthCode = $bpAuthCode.TrimEnd("`"`}")   # Note escape codes for the search expression
 
# Print the Authorization Code
$bpAuthCode
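
Once the script has run, $bpAuthCode is what gets attached to every subsequent API call, passed through the -Headers parameter of Invoke-WebRequest or Invoke-RestMethod. A sketch of that step, assuming (from memory; check the Brightpearl docs) that the header name for this style of authentication is brightpearl-auth:

# Assumed header name for the legacy authentication token
$headers = @{"brightpearl-auth" = $bpAuthCode}
# ...then pass -Headers $headers on subsequent Invoke-RestMethod calls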