The topics discussed here grow out of the bread-and-butter issues that confront my consulting and software clients on a daily basis. We'll talk about prosaic stuff like Membership Management, Meetings and Events Management and Fundraising, broader ideas like security and software project management, and the social, cultural, and organizational issues that impact IT decision-making.

Tuesday, June 13, 2006

Fundraising Lab

Last Friday Jeff Brooks had an interesting little piece on his Donor Power blog about testing your development efforts. His message is to avoid just jumping into new creative ventures without really testing the value of your fundraising offers. He quotes a piece from a site called Inside Direct Mail that warns:
There are no shortcuts to direct marketing results ... thoughtful, solid and constant offer testing is the only way to get there ....
Jeff points out that your offer can be tested along many dimensions: he suggests playing with the amount you ask for, the specifics of the call to action, adding leverage such as a matching gift program, and adding specificity to the accompanying program descriptions.

Jeff thinks improved results are much more likely to come from fine-tuning your offer than from playing with the quality of the creative material:
Spiffy new creative is exciting and fun. (Heck, I'm a creative director -- that's what I like best!) But the real leverage is farther upstream with the offer, or, in fundraising terms, the call to action.
But Jeff's post doesn't delve into the details of how to do this offer testing.

It's an issue of experimental design. You need to make sure that you try all combinations of the factors you are testing, so that you actually learn which factor contributed to an observation. For example, if you send Offer A to List 1 and Offer B to List 2, you cannot know whether the offer or the list accounts for the difference in results. Similarly, you will want to vary only one parameter at a time between offers: don't try the high ask only on the big packet and the low ask only on the simple packet, because again you cannot separate the impact of the two factors. And you'll need to make sure that each donation you receive is fully coded so that you know which list and which offer solicited it.
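The layout above can be sketched in a few lines of code. This is a minimal illustration, not anything from Jeff's post: the factor names, lists, and the `assign_offers` helper are all hypothetical. Every (ask, call-to-action) combination forms a "cell," every cell is mailed to every list, and every record carries a source code so each gift can be traced back later.

```python
import itertools

# Hypothetical factors to test -- the names are illustrative only.
ask_amounts = ["$25", "$50"]
calls_to_action = ["give today", "double your gift"]

# Full factorial design: every (ask, call-to-action) combination is a
# cell, and every cell goes to every list, so the effect of each
# factor can be separated in the analysis.
cells = list(itertools.product(ask_amounts, calls_to_action))

def assign_offers(donors_by_list, cells):
    """Rotate each list's donors through all offer cells, tagging
    every record with a source code that identifies both the list
    and the offer that solicited any resulting gift."""
    assignments = []
    for lst, donors in donors_by_list.items():
        for i, donor in enumerate(donors):
            ask, cta = cells[i % len(cells)]
            assignments.append({
                "donor": donor,
                "list": lst,
                "ask": ask,
                "cta": cta,
                "source_code": f"{lst}|{ask}|{cta}",
            })
    return assignments
```

In a real mailing you would randomize the donor order within each list before assigning cells; the round-robin here just keeps the sketch deterministic while still giving every list every combination.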

Finally, you need to do some analysis on what you received. To do this, you need to look at the results along specific dimensions. Your software can give you a lot of help with the mechanics, but to make sure you are really using statistical tools properly, you might want to get a grad student volunteer to help you analyze your data. Better yet, involve her from the start of the campaign, so the whole effort doubles as a valuable experiment, contributing to continuous improvement in your development efforts.
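To give a flavor of what that analysis looks like, here is a sketch of the most basic question you'd ask: did one offer cell pull a significantly higher response rate than another? It assumes you have the coded results described above (how many pieces were sent and how many gifts came back per cell) and uses a standard two-proportion z-test; the function names are my own, not from any fundraising package.

```python
import math

def two_proportion_z(gave_a, sent_a, gave_b, sent_b):
    """Two-proportion z-test: is cell A's response rate significantly
    different from cell B's?  Returns (z, two-sided p-value)."""
    p_a = gave_a / sent_a
    p_b = gave_b / sent_b
    # Pool the two samples to estimate the common response rate
    # under the null hypothesis that the offers perform the same.
    pooled = (gave_a + gave_b) / (sent_a + sent_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal distribution.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value
```

So if Offer A drew 60 gifts from 1,000 pieces and Offer B drew 30 from 1,000, the test would flag the difference as significant; with 50 gifts apiece, it would not. This is exactly the kind of check a statistically minded volunteer can help you run correctly, including the sample sizes you need before mailing.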

