All posts by David Zeidman

BBCon 2024 – Top Sessions to Attend

Every BBCon I like to take a look at the sessions on offer and suggest the best ones to attend. This is partly a reminder to myself of what I should see, but also a thumbs up to all those talented speakers out there who put a lot of work into bringing great content to BBCon.

If I have missed someone, then I apologise in advance. This is, after all, my list, but convince me otherwise and I will attend your session too!

One further note: there are a great many sessions covering the Microsoft Power Platform. I appreciate it is very useful for automating without the need to write code, but it really is not my area of expertise. Because of that, I have not recommended any of those sessions, but do check them out.

In no particular order (apart from my session which, quite rightly, comes first)

1. Trail Tales – How auditing your data in Raiser’s Edge can tell a story about your supporters and those that manage them

Have you wondered about Audit Trail? How can it help your organisation uncover data entry mysteries? What can it do to support training and improve efficiency? This session will tell all.

I will also be revealing a new major piece of functionality!

More info

2. Ask the Super Users: Raiser’s Edge NXT User Panel 3.0 – Part 1 and Part 2

At every BBCon I have attended there has been this type of panel. (Indeed, quite a few years ago I sat on it.) Last year saw this panel shine, answering diverse questions ranging from Raiser’s Edge best practice to the most efficient integration techniques. What is always great about this session is that there is always somebody on the panel who can answer the question and, often, there is more than one answer. Big shoutout to Carlene and Sunshine but, in fact, all the panelists are great!

Part 1 Info

Part 2 Info

3. Preview Query in Raiser’s Edge NXT Web View

There has been a lot of excitement about query in NXT. In the database view, we all took for granted the fact that you could make a selection from almost any field in Raiser’s Edge. Compare query to the equivalent functionality in other systems and it is obvious that it is a really powerful tool. Get the details of the new web view version of our classic favourite.

David Springer leads this session, as he does another similar session, Blackbaud Raiser’s Edge NXT® Web View Query: All-New Uses of a Classic Database Tool.

With Chimpegration we had to work with the List module. Now that query has been released, we are also excited to make use of it within Chimpegration.

More info

4. 6 MORE Fundraising Automations That Will Blow Your Mind

The team at Prenger Solutions are experts at working with Raiser’s Edge and finding innovative ways to integrate systems. I am excited to see what automations they have been working on and am certain that many, if not all, will indeed blow my mind!

More info

5. Cybersecurity for Low-coders

Even though this blog and my skillset come very much from a high-code or pro-code background, I am sure that there is a lot to be picked up from this session. Cybersecurity applies to us all, whether low-, high- or pro-coders. Ben Wong is a great speaker who makes complex topics easily understood. With Ben at the helm, I am sure there is much to be learnt.

More info

6. It’s the Little Things 2.0: 50 More Ways to Increase Productivity in Blackbaud Raiser’s Edge NXT®

I love this kind of session. There is always something to be picked up that I didn’t know beforehand. I regard myself as quite knowledgeable in using Raiser’s Edge, but nevertheless I am always amazed by how many techniques there are out there for achieving different results, or new and innovative ways to use our favourite technology. I have worked with Judith over the years and am always very impressed by the things she suggests.

More info

7. Optimizing Gift Entry Flow

What BBCon session list would be complete without a session by Bill Connors? Bill and I go way back to when I first started out working with Raiser’s Edge, and he inspired me on my journey. Bill is the author of “Fundraising with The Raiser’s Edge”, which gives advice way beyond that of a simple manual.

Whenever Bill is a speaker, he offers a masterclass in clear communication and expert insight. His deep understanding of the subject matter is evident throughout, and his ability to convey complex ideas in a digestible manner is truly impressive.

For anybody working with gifts in Raiser’s Edge this is an essential session.

More info

Working with the SKY Query API in Chimpegration

Blackbaud recently added the SKY Query API for Raiser’s Edge and Financial Edge. At first I was not really sure how this would be of benefit to our products. We have worked with Lists in Raiser’s Edge and on the database view we have generated static queries of records that have been processed. I never thought that we had a need for adding criteria to queries.

Now that I have seen the new Query API, I have been inspired.

How have we got around the lack of a query API so far?

In Chimpegration we push data from Raiser’s Edge to Mailchimp. In order to decide which records to push, we let the user select an NXT list that has previously been created. There are some issues with this.

  • Firstly, the list selection criteria are limited. The user cannot, for example, specify that a constituent with no email address or a blank email address should be ignored.
  • The list functionality does not allow you to choose specific output fields. We offer a limited range of fields that we think the user may want to export.
  • The lists are static. If you want to update them, you can, but this prevents data being pushed to Mailchimp according to a schedule. (There are some workarounds involving Queue, but these are awkward.)

How does the Query API solve this?

The query API allows you to programmatically list all queries and to load one of them in particular. You can then run the query and fetch the results. This makes it fully dynamic. Scheduled data exports to Mailchimp would be up to date with the latest information.

The user can choose the output data. Whereas previously they were limited to the fields we offer, now they can choose any value available to them in a query. Most organisations won’t want to push a membership attribute or a gift notepad to Mailchimp, but there is bound to be one out there that wants to do something like that, or some other option that we had not considered. With the full range of output fields, they are no longer restricted to what we offer them.

The same goes for criteria. Previously the user was restricted to the fields available in lists. Now the whole range of query fields can be a part of the criteria. If an organisation only wants to export constituents attending a specific event with a large t-shirt size, now they can!

What else can we do with Query API?

In Chimpegration Classic and in Importacular we look up constituents using criteria sets, which has made it much simpler to look up constituents with a wide range of criteria. There is some scope for this with the constituent list API endpoint. However, the Query API really gives this some muscle.

At first I thought that, even if we could do it, it would not be practical to create a query each time the user wanted to use criteria to search. We would end up creating a lot of queries saved in Raiser’s Edge. It is not certain that the user would have rights to delete a query. It just felt wrong.

However, the Query API includes the ability to generate a query on the fly. By including the filter, output fields and sort values in the JSON payload, the query can be run without explicitly saving it to the organisation’s environment. This is a real game changer.
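
As an illustration only (the exact endpoint paths and payload schema live in the SKY API Query documentation, and the field names below are made up), an ad hoc execution in Python might look something like this:

# Rough sketch only: check the SKY API Query documentation for the real
# endpoint path and payload schema before using any of this.
import requests

headers = {
    "Authorization": "Bearer <access token>",
    "Bb-Api-Subscription-Key": "<subscription key>",
}

# Hypothetical ad hoc query definition: the filter, output fields and sort
# go in the payload, so nothing is saved to the organisation's environment.
payload = {
    "type": "Constituent",
    "filters": [{"field": "Email Address", "operator": "NotBlank"}],
    "output_fields": ["Constituent ID", "Name", "Email Address"],
    "sort": [{"field": "Name", "direction": "Ascending"}],
}

# Placeholder URL: substitute the real Query API execution endpoint.
response = requests.post("https://api.sky.blackbaud.com/query/<execute endpoint>", headers=headers, json=payload)
response.raise_for_status()
print(response.json())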

The SKY API documentation does suggest that this should not be used instead of the constituent list and constituent search endpoints, as they are optimised for search. However, being able to search on more obscure areas of Raiser’s Edge certainly adds a lot of power for looking up records that was previously missing.

When are we releasing the new Chimpegration functionality?

Update: This has now been released!

We are actively developing this functionality. However, the Query API is still in preview, so it is uncertain when this will be released. We may also release it with the caveat that our functionality is in preview and may break at any time (due to changes in the Query API). This is a matter of a few weeks though, so watch this space for a demo!

Importacular and Regular Expression Transformations

Some Preamble

Regular expressions are really complicated. Even now I find it difficult to get my head around them. If you are new to them, check out these two sites:

https://www.regular-expressions.info/ – a great tutorial and reference

https://regexr.com/ – a really good “playground” for testing your regular expressions

Overview

Importacular offers the user the ability to transform incoming data from one value into another. When we first started out, this was simply a “from” value and a “to” value: if the incoming value matched the “from”, it would be changed to the “to”. That was very simple but effective. We soon realised that more power was needed, so Importacular added partial matches or word matches (and clarified that the original was an “exact” match).

We also then added different replacement types too. These were “Complete” and “Partial” and later “Append” and “Prepend”. If you selected “Complete” then all of the incoming value was replaced with the replacement value. If you selected “Partial” then only the matched part would be replaced, keeping the remainder of the original value. “Append” and “Prepend” would add the replacement text to the end or the beginning of the original respectively.
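
Importacular’s own code is obviously not something I can show here, but as a rough Python sketch of how those four replacement types behave (the function and values are made up purely for illustration):

# Illustrative sketch only, not Importacular's actual code.
def apply_replacement(original, matched, replacement, replace_type):
    """Apply one transformation row, given the part of the value that matched."""
    if replace_type == "Complete":
        return replacement                                 # whole value replaced
    if replace_type == "Partial":
        return original.replace(matched, replacement)      # only the matched part replaced
    if replace_type == "Append":
        return original + replacement                      # added to the end
    if replace_type == "Prepend":
        return replacement + original                      # added to the beginning
    return original

print(apply_replacement("Mr Smith", "Mr", "Mr.", "Partial"))   # Mr. Smith
print(apply_replacement("Mr Smith", "Mr", "Mr.", "Complete"))  # Mr.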

Then we added RegEx – firstly for matching and then for replacing. The rest of this post describes how that works.

Matching

Importacular loops through each row in the data transformation grid and continues through each row unless the stop processing flag has been set.

If you choose a match type of RegEx, you can put your RegEx in the “From Source” cell and Importacular will try to match on it. For example, if you use this very simple RegEx:

.*

Importacular will match on any number of any character, i.e. it will always match on whatever is found.

If you use this RegEx:

B[a-z]g

It will match on “Big”, “Bog”, “Bag” and also “Bkg” (in fact, “B” followed by any lowercase letter from a to z, followed by “g”).

If it finds a match, it will try to replace the value.

Replacing

When you replace using RegEx there are two things to note.

  1. It does not matter how the match was made. It could be a RegEx, a complete, a partial or a word match. Replace is independent of how the match was made.
  2. Importacular does not use the classic RegEx replace mechanism, i.e. creating a capture group (often using parentheses, or sometimes a backslash and parentheses) and then referencing that group with a dollar sign, e.g. $1 or $2. Importacular does not use this method!

Importacular’s replace works like this. It takes the incoming value and applies the regular expression to it in order to extract a value. That value is then used as the replacement text. For example, if the incoming value is:

2022 Annual Appeal

We can extract the year by using the regular expression:

^20[0-9][0-9]

(Note that there are a number of different ways you could get the same information out using a RegEx. This is just one of them)
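
If you want to see what a RegEx will extract before putting it into the grid, the equivalent check in Python looks roughly like this (purely illustrative; Importacular does the extraction for you internally):

import re

incoming = "2022 Annual Appeal"
match = re.search(r"^20[0-9][0-9]", incoming)
if match:
    print(match.group(0))   # 2022 - this extracted value becomes the replacement text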

Say I have a US phone number and I want to get the area code. The phone number comes in two different formats, e.g. (415)-123-456 or 415-123-456. I can extract the area code using the following:

(?<=\()[0-9]{3}|^[0-9]{3}

If I want to be really clever, I can use a second row in my transformation to transform the area code into the city. In this case after extracting “415” I would transform it to San Francisco.
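
Purely as an illustration of what those two rows are doing (this is not Importacular code, just the same logic written out in Python):

import re

phone = "(415)-123-456"

# Row 1 (RegEx match): extract the area code whether or not it is in parentheses.
match = re.search(r"(?<=\()[0-9]{3}|^[0-9]{3}", phone)
area_code = match.group(0) if match else phone

# Row 2 (exact match): a second transformation row maps the code to a city.
cities = {"415": "San Francisco", "212": "New York"}
print(cities.get(area_code, area_code))   # San Francisco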

Conclusion

The hardest part of using regular expressions in Importacular really is the regular expression itself. I won’t try to convince you otherwise. Hopefully this post will make it easier to use those regular expressions once you have determined what you need. Use the RegExr site (link at the top of this post) to test your matching and replace extraction before you put it into the transformation grid. Once it is in the transformation grid, you can also check the review screen to see if it has worked: if everything has worked as expected, the resulting value will show up there transformed.

Audit Trail Cloud doubles the areas tracked

Today we released a new version of Audit Trail Cloud and what a release it is!

New Tracked Areas

We have doubled the number of areas tracked. As well as constituent, gift, address, email, phones, online presence, prospect and solicit codes, you can now also track actions, relationships (both individual and organisational), constituent codes and constituent custom fields.

We said that as soon as Blackbaud released new webhooks we would release new areas to track additions, changes and deletions and that is exactly what we have done.

To complement the new areas, we have also added two new tiles. When looking at a specific action page or a specific gift page in the Raiser’s Edge web view, we now include a tile that shows the history of changes for that action or gift respectively. (You only see this if it has been turned on for your login in Tile Security under the Configuration area of the Audit Trail Viewer)

And More Too!…

A while back Blackbaud added the functionality to see who made a change to the record. This was a much sought after addition but it was not available for all records. The changed by id only comes through for constituents, gifts and addresses. This meant that for all the other areas, we were forced to simply write “Not Available”.

As part of this release, if a user has made a change to, say, a constituent and then to an email address, we will infer the changed by user from the constituent changed by data. This is not an exact science as, in theory, there are several possibilities. It could be that one person is changing the constituent and, unlikely though it may seem, another goes and changes their email address. Alternatively, the same person changes the email address before the constituent record. (Or the changes are made so close together that the webhook for the email is fired before the webhook for the constituent, even though the constituent is changed first.) In these cases we will still write "Not Available".

To make it clear, for inferred changes, the changed by user is given in italics and hovering over it will show the tooltip that the value is inferred.
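
The idea behind the inference is simple, even if our implementation has more to it. A rough sketch in Python, with made-up field names:

from datetime import timedelta

# Illustrative only: infer the user for an email change from a recent change
# to the same constituent, since the email webhook has no changed by id.
def infer_changed_by(email_change, constituent_changes, window_minutes=5):
    candidates = [
        c for c in constituent_changes
        if c["constituent_id"] == email_change["constituent_id"]
        and abs(c["changed_on"] - email_change["changed_on"]) <= timedelta(minutes=window_minutes)
    ]
    if not candidates:
        return None                                          # shown as "Not Available"
    latest = max(candidates, key=lambda c: c["changed_on"])
    return {"user": latest["changed_by"], "inferred": True}  # rendered in italics with a tooltip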

Upgrade (or Purchase) Now!

If you are an existing user, you will be prompted to upgrade the next time you go into the Viewer. Just go into ZeidZone to download the latest configuration version.

If you are not an existing user then what are you waiting for?! Get in touch now

Adding a primary constituent code

A SKY API task that is common but not obvious is adding a primary constituent code to a constituent record.

I say that it is not obvious because adding a first constituent code automatically makes it the primary. However, adding a second does not change which code is the primary.

One way of doing it is to remove the existing constituent code and then add the second one, followed by what was the first one. This is a highly unsatisfactory workaround.

The solution lies with the ‘sequence’ field. When creating your constituent code, set this value to 1 and this code becomes the primary.

Here is a sample payload:

{
  "constituent_id": "11278",
  "description": "Volunteer",
  "end": { "d": 13, "m": 11, "y": 2023 },
  "start": { "d": 12, "m": 01, "y": 2022 },
  "sequence": 1
}

This will put this constituent code at the top of the list and it will become the primary.
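
If you are calling the API yourself, the request looks roughly like the Python below. I believe the constituent code endpoint is the one shown, but do check the SKY API reference:

import requests

headers = {
    "Authorization": "Bearer <access token>",
    "Bb-Api-Subscription-Key": "<subscription key>",
}

payload = {
    "constituent_id": "11278",
    "description": "Volunteer",
    "start": {"d": 12, "m": 1, "y": 2022},
    "end": {"d": 13, "m": 11, "y": 2023},
    "sequence": 1,   # sequence 1 puts the code at the top and makes it the primary
}

# Verify the exact path against the SKY API constituent documentation.
response = requests.post(
    "https://api.sky.blackbaud.com/constituent/v1/constituentcodes",
    headers=headers,
    json=payload,
)
response.raise_for_status()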

As an aside, when working with constituent codes it would seem possible to PATCH a constituent code description. After all, you supply the id for the code, so it seems reasonable that you could simply change the code description. However, this is not possible. You do not get any kind of error message; indeed, the response indicates that everything has worked as expected (but the code value does not change).

An Audit Trail Update

When we released Audit Trail Cloud we were not able to show the name of the person who made the change. This has been a big issue. However recently Blackbaud updated their webhook API. For some (but not all) webhooks, Blackbaud now send through the id of the person that made the change.

So what’s the problem?

Firstly, not all the webhooks send through the person who made the change. At the time of writing, only the constituent, gift and address webhooks include the changed by id. This leaves records such as email and phone without this information.

What is more, we are only given these for add and change webhooks but not delete. This leaves us not knowing who deleted records.

Which id are we given?

This is a curious one. We are given the database view id for a user. This id is not obviously visible anywhere in the application (you can see it in query if you look at the SQL). In order to convert the id to a username, we have to use the new (at the time of writing) NXT Data Integration API. The Get Information about a User method lets you get user details based on the id.

One caveat with this whole API is that it requires that the user calling it is an environment admin.
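
As a rough sketch of how we use it (the URL below is a placeholder; take the real path and response fields from the NXT Data Integration API documentation):

import requests

headers = {
    "Authorization": "Bearer <access token>",
    "Bb-Api-Subscription-Key": "<subscription key>",
}

def lookup_user_name(changed_by_id):
    # Placeholder path: use the real "Get information about a user" endpoint.
    # The calling user must be an environment admin.
    url = f"https://api.sky.blackbaud.com/<data integration path>/users/{changed_by_id}"
    response = requests.get(url, headers=headers)
    if response.status_code != 200:
        return "Not Available"
    return response.json().get("user_name", "Not Available")   # field name assumed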

What’s Next For Audit Trail?

We know that some new webhooks are coming soon and as soon as they do we will incorporate them.

We are also looking at inferring the changed by user. If a change is made to a constituent and then soon after a change is made to their email, we infer that the same person made that change… Coming soon!

Audit Trail Cloud

I have not posted for a while (in case you missed it, there was a global pandemic). However, I have saved myself for a great announcement. As you have perhaps read, we are about to release Audit Trail Cloud.

Where did it come from?

Audit Trail Cloud is based on the concept we introduced when we developed Audit Trail Professional many years ago. Back then Raiser’s Edge was only available in what is now called the database view. AT Pro made use of VBA (Visual Basic for Applications). Whenever a record was opened, it would take a snapshot of the field values and, when the record was saved, it would compare the new values against that snapshot, saving the differences to the database.

When organisations moved to Blackbaud hosting they lost the ability to make use of AT Pro; we were not allowed to use VBA on the hosted platform. Many of our clients were sad to lose such a great application; others were shocked that they would not be able to take it with them. All of these clients and more were hankering for an Audit Trail that worked with NXT and the SKY API.

So how is Audit Trail Cloud Different?

We had a number of challenges when approaching Audit Trail Cloud.

In the beginning we could not do it

In the beginning we just could not do it. There was no simple way of knowing if a record had changed. Of course we could poll RE NXT to see which records had changed recently but that was not really a viable solution. 

Later on we could do it… just differently

Along came webhooks. Webhooks told us when a change was made. This was just what we were waiting for. However, webhooks did not tell us exactly what had changed and we did not know what the previous value was. To get around this, during the setup, we take an initial snapshot of those fields that we are expecting to receive by way of webhooks. We retrieve and store a baseline set of data so that we know the value of a record before it has been changed. At the time of writing this there are a limited number of webhooks, so we are not downloading the whole database. The areas covered at present include biographical, address, contact records and gifts. We track changes for some other areas but cannot extract the baseline data sets easily.
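
In outline the pattern is straightforward. A very rough sketch of it in Python (not our production code, and the shape of the webhook payload is simplified):

# Snapshot-and-diff pattern: keep a baseline, compare it with the record
# each time a webhook arrives, and store only the differences.
baseline = {}   # record id -> last known field values, seeded during setup

def handle_webhook(event, fetch_record):
    """event: simplified webhook payload; fetch_record: reads the record via the SKY API."""
    record_id = event["record_id"]
    current = fetch_record(record_id)               # the record as it is now
    previous = baseline.get(record_id, {})
    changes = {
        field: (previous.get(field), value)
        for field, value in current.items()
        if previous.get(field) != value
    }
    baseline[record_id] = current                   # new baseline for the next webhook
    return changes                                  # stored as the audit trail entry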

Could be awkward

A further consideration is that we do not want to be responsible for your data. Everybody knows how awkward a data breach can be. Lots of red faces all around. But worse is the fact that when data is compromised, responsibility lies with the vendor. As a small company this is not a liability we were prepared to take on. We decided to give you full control of your data. Or at least farm off liability to you and to another company that is much better placed than we are to handle security. All your data is stored in the cloud with AWS (Amazon Web Services). It is locked down from us. Unless you give us the password, we cannot access it.

But wait… there’s more

One great feature that we were definitely not expecting was the breadth to which the changes are covered. While at the moment there are a limited number of areas and fields, it seems as though they are covered in a lot more places.

We thought that doing an NXT version of Audit Trail would only capture changes in NXT. However it also captures changes made in the database view. As well as that, whereas with Audit Trail Pro we had to implement a workaround to capture global changes, with Audit Trail Cloud those changes are automatically captured in the same way as any other change.

Viewing Records

In Audit Trail Pro we had the Audit Viewer. This was a grid where the changes were shown. You could filter the changes by date, record area and field. We have reproduced this, less the dour Winforms look of the early noughties.

We have also added a constituent tile. Going on to a constituent record, you can view the changes for that one record and see how it has been edited over time.

What do we see in the future?

So far we have been limited to the webhooks that Blackbaud have released. We are told there are more on the way, so as soon as they are released we will add them to our arsenal. Beyond that, we hope to be able to add a revert option so that you can undo erroneous changes. We are also adding tiles for other records that have their own NXT pages, such as gifts, and for other areas as they are released, such as actions.

One piece of functionality that we felt was essential but is not included in this first version is a record of the user who made the change. The webhooks just do not give us this information. We are told this is coming imminently so this is our number one priority as soon as it is released.

How do I find out more?

You mean this has not been enough information for you? Well you are in luck. Take a look at our webpage:

Audit Trail Cloud – Zeidman Development

Or sign up to a webinar about Audit Trail Cloud

Audit Trail Cloud Demo (clickmeeting.com)

SKY API and Postman

The SKY API documentation is very good compared to many APIs that I work with. One area that is particularly useful is the “Try It” area where you can test an endpoint with your own data. One small annoyance is that if you want to try a number of different endpoints, you need to go through the whole OAuth2 process each time.

Using Postman allows you to avoid this and also take advantage of a number of other features of that application. One nice feature is the ability to generate access tokens with ease so that they do not need to be refreshed each time you run an endpoint. (You will still need to refresh them after the allotted 60-minute life, but that should give you plenty of time to run a number of endpoints.)

This post shows you how to set up Postman for oAuth2 in the simplest way possible. I am not an expert in Postman and there may well be things that I have missed that could make the process even easier. Let me know in the comments if you do something differently!

Setup your application

It is probably wise to have a separate Blackbaud application for Postman rather than using an application that you use in production.

This is my very basic app. For Postman you do not need a live redirect URI but you do need one as the SKY OAuth2 process requires that the value you send in matches a value on your app.

Setup Postman

In my Postman I have a collection of SKY API endpoint calls. You are able to add authorisation details to the main folder and have each endpoint inherit the credentials from that, rather than having to enter them each time.

When I click on the Sky API link I can enter my credentials for the whole folder.

Hopefully most of the values are self-explanatory. You should start with the section “Configure New Token”.

Token Name: Just give your token a name so that you recognise it.

Grant Type: Authorization Code

Callback URL: This is one of the values that you have in your SKY app

Auth URL: https://oauth2.sky.blackbaud.com/authorization

Access Token URL: https://oauth2.sky.blackbaud.com/token

Client ID: This is the Application Id from your SKY app

Client Secret: From your SKY app

Scope: This is not currently used by the SKY API

State: You could put a value here but it has no real use in the context of Postman

Client Authentication: Send as Basic Auth Header

Press the “Get New Access Token” button. This will prompt you to log into Blackbaud and go through the OAuth2 process. It will then save a token in the Current Token area. This is then used by your calls.
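
Behind the scenes Postman is doing a standard OAuth2 authorization code exchange. If you ever need to script it yourself, the token request looks roughly like this in Python (the authorization code itself still has to come from the browser redirect step):

import requests

# Exchange the authorization code returned to your redirect URI for tokens.
# The client id and secret go in a Basic auth header, matching the Postman setting.
response = requests.post(
    "https://oauth2.sky.blackbaud.com/token",
    auth=("<application id>", "<application secret>"),
    data={
        "grant_type": "authorization_code",
        "code": "<code from the redirect>",
        "redirect_uri": "<redirect URI registered on your app>",
    },
)
tokens = response.json()
access_token = tokens["access_token"]     # valid for 60 minutes
refresh_token = tokens["refresh_token"]   # use this to obtain a new access token later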

Setup an individual endpoint call

Now that you have set up authorisation, you can proceed to try a call. You will still need to add your subscription key to the header as shown. (If anybody knows of a way to add that value at the folder level, let me know!)

Put the URL in the address box and change the authorisation to inherit from parent as shown below

On the headers tab, add the bb-api-subscription-key

This can be found on your developer account here: https://developer.blackbaud.com/subscriptions/

Then you are ready to press the Send button to retrieve data from SKY API
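
The same call is easy to reproduce outside Postman too. For example, in Python, using the constituent list endpoint (swap in whichever endpoint you are testing):

import requests

headers = {
    "Authorization": "Bearer <access token>",             # the token Postman generated
    "Bb-Api-Subscription-Key": "<your subscription key>",
}

response = requests.get(
    "https://api.sky.blackbaud.com/constituent/v1/constituents",
    headers=headers,
    params={"limit": 10},
)
response.raise_for_status()
for constituent in response.json()["value"]:
    print(constituent["id"], constituent.get("name"))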

BBCon 2020 – Session Recommendations

This year will be a very strange BBCon. Gone is the flight over, the anticipation, hordes of people all in one place, the discussion over the food, the inevitable “bacon” joke (does that joke ever get old?), the jet lag and, of course, all that swag.

One thing still there, though, is the quality speakers and sessions. As with each year, I am giving a round-up of the most interesting speakers and anticipated sessions. These are all my personal opinion which is, of course, from a more technical perspective, so if I have left anybody off the list who justifiably should be on it, then I apologise in advance.

So, in no particular order….

Utilize the Power of OData in Blackbaud Altru® to create Valuable Dashboards in Microsoft® Excel and Power BI – Carly Meek.

We make use of OData in Chimpegration Cloud for Altru. From a technical perspective it is one of the most important distinctions between the Altru development platform (Infinity) and the SKY API. OData allows any query that has been put together in Altru to be “streamed” to other platforms: in our case Chimpegration, but importantly also applications such as Excel, Power BI and other analytics applications. Definitely one to watch if you need access to your data in a more advanced setting.

Nonprofit Analytics: How To Build Financial and Fundraising Dashboards – Thomas A. Evans and Linton Myers

This session is a great follow-up to the session that I did last year with Graham Getty (From Crystal to Cloud. See the live demo part here). Where Graham and I looked at ways in which you could replace and enhance legacy functionality and move to the cloud, Thomas and Linton take another look at how those capabilities have changed and add some that Graham and I may well have missed. Definitely worth taking a look at.

Get Your Head in the Cloud – Eric Wand

As developers we sometimes forget life before the Cloud. The ease with which you can reach all your resources wherever you may be is something that web applications strive to help customers with. And yet I still find resonance with Eric’s opening sentence: “Remember when the internet felt expensive, untrustworthy and complicated?”. When I compare developing a plugin to developing a cloud solution this still rings true. If I still get nostalgia over desktop applications then surely less-than-technical users may well feel the same. This session is sure to convince you otherwise.

Open a world of possibilities with SKY Developer – Stu Hawkins and Ben Wong

I have seen some of the customisations that Stu and his team have produced and I am regularly impressed. The SKY API platform has come a long way from the early days and there is now so much more functionality available to developers to customise RE NXT. It will certainly be fascinating to see what new and exciting utilities Stu has to offer. If you are new to developing or want to get started, make sure that you take a look at this session and quiz Ben, who has all the answers!

Back by Popular Demand: NXT Tips and Tricks – Lisa Nurminen and Jarod Bonino

This session is becoming an annual favourite. Having missed it at BBCon 2019 in Nashville (the room was too packed to get into), I managed to see it at BBCon 2019 London. Now a husband-and-wife team (sorry if I spoiled the worst-kept secret outside of Blackbaud), they are back. Lisa and Jarod know NXT inside out and have found ways of working with it that overcome any drawbacks that you might find. I felt that this session was worth it even though I am not working with NXT on a day-to-day basis. You can be sure that they will have some new tips and tricks up their sleeves as well as showing the best ones from last year.

Mastering Security in Blackbaud Raiser’s Edge NXT® – Bill Connors

No session list would be complete without Bill Connors. Bill wrote the book on Raiser’s Edge (literally). It will be great to see a session looking at NXT security. While the database view is somewhat of a known entity, the new NXT modules are, for me at least, somewhat of a mystery. How you link that functionality with good organisation policies will be essential viewing for anybody managing a database.

Nothing takes your fancy? Well here are some of our favourites from past years:

Troubleshooting the creation of a SKYUX Addin on Windows

I just came back from BBCon in Nashville all fired up and ready to create a SKY UX Addin. For those of you new to this, an add-in can be a tile in a SKY-based application, i.e. Raiser’s Edge NXT (with other components to come in the future).

I have tried this in the past without much luck. There were issues with the SKY UX CLI not working on my Windows machine (I am told the dev team use mainly Macs). Fast forward a few years and I thought that I would give it another go.

Before I start, I just wanted to say that the documentation that Blackbaud have produced for all aspects of the SKY developer platform is really very good. It is so much better than anything that they have ever produced in the past. I wanted to document this in case anybody should ever run into the same difficulty. It may be that you will never have this problem (especially if you, unlike me, actually read the prerequisites before starting!)

Configuration

I am primarily following the instructions here

However, before I could even start with that, I needed to install the SKY UX SDK. This is done with the following:

npm install -g @blackbaud/skyux-cli

npm install -g @skyux-sdk/cli (thanks Ben Lambert for setting me straight!)

I then followed the instructions in the main article. I fired up Visual Studio Code, started a terminal window and went into my project directory.

skyux new -t addin

This was where the process failed for the first time. I got the error:

 addin template successfully cloned.
Setting @blackbaud/skyux version 2.54.1
Setting @blackbaud/skyux-builder version 1.36.0
× Running npm install (can take several minutes)
npm install failed.

Following this, I ran a verbose version of the command:

skyux new -t addin --logLevel verbose

This gave me much more information. It told me that Python was not installed on my system, so it could not work.

I ran the following to install Python. However, this was not all plain sailing either… I ran this from a command prompt run as Administrator:

npm install --global --production windows-build-tools

The first time this ran, it told me that Python was installed successfully but then it just sat there for a while. Task manager told me that the command prompt was doing something but nothing happened on the screen. I waited for at least half an hour. The command prompt no longer appeared to be working hard (according to task manager), so I broke out of the process (CTRL+C). I ran the same process again and it worked very quickly, returning me back to the command prompt.

I then ran the skyux new command again and everything appeared to install.

Serving up the application

The next step according to the instructions was to “serve” the application. There is one, probably obvious, step that is missing from the instructions. In Visual Studio Code you need to open the folder where your project has been installed. When you first open VS Code, assuming that no previous workspace or folder has been opened (I had closed mine), all you have is the welcome page. Click on the Open Folder button to show your app in the folder structure.

You can then serve the app:

skyux serve -l local