Numbers Count: How to Use Data for More Effective Fundraising


by Taj.

Let’s face it: fundraising can be one of the most dreaded aspects of running a nonprofit. A lot of people feel unprepared and apprehensive about it; asking for money is hard. But there are ways to make it a little easier and more effective.

That’s where data come in.

Perhaps you think of data and fundraising as natural complements to each other. Perhaps you never considered using data in your fundraising efforts. And perhaps you aren’t even sure what qualifies as data. Well, I’m here to help. In this post I’ll go over how you can use data to support and enhance your fundraising efforts for more successful results.

Why Even Use Data?

You might be wondering, “Isn’t fundraising all about emotional appeal? So why even use data at all?” It’s true that people are motivated to give by compelling stories. But they also have to believe your story. Data, when used appropriately, is incredibly compelling. It can support your story with information, numbers, and facts, which are especially helpful if your audience is skeptical. And data visualization can further add a creative and emotional component to your fundraising campaigns.

But there’s more to using data than just throwing numbers at your audience. You can use it in-house, too, in order to create segmented and targeted fundraising campaigns, to support grant writing, and to grow your membership.

But before I get into all that, let’s clear up one thing. What is data? People often think data has to involve giant databases and lots of numbers. Both of those things are powerful (especially if you’re talking about audience segmentation), but data can also be viewed more broadly. As Jim Collins said in Good to Great and the Social Sectors:

“It doesn’t really matter whether you can quantify your results. What matters is that you rigorously assemble evidence—quantitative or qualitative—to track your progress. If the evidence is primarily qualitative, think like a trial lawyer assembling the combined body of evidence.”

Collins goes on to talk about what to do with quantitative evidence (think like a scientist), but I would argue that in this context you should visualize it and use it to support your qualitative information. Back up those stories with numbers, and they’re instantly more believable.

So data can be quantitative (demographics, number of donors, amounts given, member satisfaction ratings, etc.) or qualitative (personal stories about why people give, narratives about how your organization impacts people or policies, and even before and after images). The important thing is that you collect the right information on a regular basis to fuel your decisions and to show your impact.

This leads to the question: how can you use data to support the fundraising and development efforts of your organization? In general, you can use it to analyze where your funding comes from, which donors or organizations give more and under what circumstances, and how your social media strategy impacts your fundraising efforts. Social media data is a whole blog post on its own, but here are some other ways to use data for your fundraising efforts:

Grant Seeking

Grants are an integral part of funds development for nonprofits. Data can support your grant seeking efforts in several different ways:
  • You can use data in your proposals to support your case. Demonstrate the need for your services with census data, service data, and personal stories from your clients or community members.
  • Once you have the grant, you can use data to demonstrate to the funder that you used their funds as intended (services provided, people served, campaigns launched, partnerships formed, etc.). Donors and funders like to see proof that their money was well spent. They like to know who was served and how, even information about the quality of the services or the efforts.
  • If appropriate, you can also use data to show the impact of your work. What this looks like varies broadly. In some cases, the funder will have very specific requirements as to what they want you to report out on. In other cases, the requirements might be vague, and this gives you some room to be creative. For example, you could exclusively provide them with numbers about your programs and their successes, but it’s also very helpful to provide qualitative data, or stories, about the impact of your work. You can include images of your program, your community changes, and even videos. And you could use narrative information from individuals, group discussions, or town hall meetings to support your claims that the work that was funded had an impact.

Special Events

Many nonprofits have a big signature event every year or even several special events. Data can help you with this in a few ways:
  • You can use financial data to tell you how much was spent putting on the event compared to how much the event raised (a quick sketch of this calculation follows this list).
  • Data can tell you what money was raised from big donors, sponsors, or regular attendees. It can help you target attendees and sponsors year after year and look at trends.
  • You can also use data at the event to boost additional donations. Use images of clients and neighborhoods, combined with data about the need you are hoping to address, or data (perhaps some numbers combined with compelling quotes) about how your clients or communities have been helped by the work you do.
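To make that first bullet concrete, here is a minimal Python sketch with entirely made-up event numbers (your accounting categories and figures will differ); it compares what each event cost against what it raised:

```python
# Hypothetical event financials -- names and numbers are invented for
# illustration; pull the real figures from your accounting records.
events = [
    {"year": 2013, "cost": 12000, "raised": 30000},
    {"year": 2014, "cost": 15000, "raised": 42000},
    {"year": 2015, "cost": 14000, "raised": 39000},
]

for e in events:
    net = e["raised"] - e["cost"]   # what the event actually netted
    roi = net / e["cost"]           # return per dollar spent
    print(f"{e['year']}: net ${net:,}, ROI {roi:.0%}")
```

Even a simple table like this, tracked year over year, shows whether your signature event is trending up or down.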


Individual Donors

If you work with individual donors, there is a potential gold mine in your donor database. Keep the database up to date, regardless of what software you use, and you can start to see what type of donor gives regularly, what type gives more, and at what time of year they give. Some donors prefer to give monthly, some prefer to attend your big event, and others are great about donating in-kind services. If you can segment your donors using this data, you can target them based on the kind of giving they prefer.
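As one illustration of that kind of segmentation, here is a hypothetical Python/pandas sketch; the column names, thresholds, and segment labels are all invented, and a real donor database export will look different:

```python
import pandas as pd

# Hypothetical gift records exported from a donor database.
gifts = pd.DataFrame({
    "donor_id": [1, 1, 1, 2, 3, 3],
    "amount":   [25, 25, 25, 5000, 100, 250],
    "month":    [1, 6, 12, 11, 4, 12],  # month each gift arrived
})

# Summarize each donor's giving pattern.
profile = gifts.groupby("donor_id")["amount"].agg(
    num_gifts="count", total="sum", average="mean"
)

# Illustrative segments -- tune these thresholds to your own data.
profile["segment"] = "occasional"
profile.loc[profile["num_gifts"] >= 3, "segment"] = "regular"
profile.loc[profile["total"] >= 1000, "segment"] = "major"

print(profile)
```

Each segment can then get the appeal that fits its giving pattern: monthly reminders for regular givers, event invitations for major donors, and so on.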

You can also use data in your donor campaigns. People often do respond best to emotional appeals, and you can combine compelling emotional storytelling with solid sources of data about your issue, the need for your organization’s work, and the people and communities that will be impacted. This will reach people who are more inclined to donate based on emotional appeals, as well as those who prefer a more logical approach and want to know that their money is well spent. Including data rounds out your appeal, making sure you’re reaching as many people as possible.


Memberships

If memberships are an important part of your fundraising strategy, then recruiting and retaining members, as well as keeping them happy, are important contributors to the health of your organization. There is more to this than just surveys:
  • Data can help you identify which membership recruitment strategies are most effective. If you send out mailings or use social media to recruit members, you can track not only which efforts are most effective at increasing membership, but also what kind of members each strategy is likely to pull in (a small sketch of this kind of tracking follows this list).
  • Analysis can help you to segment your members, as you would with your donors, and use different strategies to reach out to them. Older members, long-time members, male vs. female members — there are many ways to think about members and why they are members, especially if you have a lot of them.
  • You can also use data collection to learn more about your members, how they feel about your organization, what they are looking for in a membership, and how you can improve in serving their needs. Here a survey is very helpful, but focus groups and interviews can also give you lots of good information.
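As a toy illustration of tracking recruitment sources (see the first bullet above), here is a hypothetical Python/pandas sketch; the channels and member types are invented for the example:

```python
import pandas as pd

# Hypothetical sign-up records: where each new member came from
# and what kind of membership they chose.
signups = pd.DataFrame({
    "channel": ["mailing", "social", "social", "event", "mailing", "social"],
    "member_type": ["basic", "basic", "student", "premium", "premium", "basic"],
})

# Which recruitment channel pulls in which kind of member?
print(pd.crosstab(signups["channel"], signups["member_type"]))
```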

So whatever your fundraising strategy is, you could be using data (quantitative AND qualitative) to support your efforts and make them more effective. Fundraising is hard work, so make it as effective and efficient as possible with data on your side.

Want me to write a blog post on a specific topic related to data and fundraising? Let me know!

Design & Evaluation: Radical Collaboration


by Taj


(Pardon our silence over these past several months! After our unintentional hiatus, we’ll be getting back into our blogging routine, sharing evaluation-related news, tips, and tricks on a somewhat monthly basis. Starting with today’s post, the first in a series of posts about design and evaluation…)




Over the past few years, as CRC has explored and embraced visual thinking, information visualization, and the use of technology in evaluation, I’ve gotten a real-world education in design, technology, and design thinking. I’ve done a lot of reading, run even more experiments, and added a formal education in these areas by completing a Master’s in Information Visualization from the Maryland Institute College of Art.


Researchers have plenty to do. Keeping up with the current trends in their field(s), new data collection methods, the latest in propensity scoring, and the changing context of neighborhoods and communities, to say nothing of keeping an eye on changing funding priorities from foundations and government agencies … it’s a lot of work. So I wanted to make this part – incorporating design thinking – a little easier for you. Through this series, I’ll be sharing some insights to help you think differently about how you work, with hopes of starting a conversation about what the world of social sciences can learn from the world of design. (These thoughts are informed by many books, conversations, and conferences, but especially by the work of Don Norman and others in the human-centered design field, that of the Institute of Design at Stanford, and of Tim Brown and his colleagues at IDEO.)


This first post will focus on a principle of design thinking. Those that follow will talk about how we might adopt design frameworks, and take a look at Norman’s work on Human Centered Design, which examines how to concretely make things that people can and will use, and will actually enjoy using. (And if you’ve ever made a form that people hate, you know we can certainly learn a thing or two from Norman.)


Part 1: Radical Collaboration


There are many, many types of evaluation and research, and some focus explicitly on being collaborative, participatory, and/or empowering. But research is often a top-down endeavor. It may require people like us, researchers and evaluators with advanced degrees and specialized skills. We know things others don’t. We know how to write survey questions, how to do representative sampling, how to conduct focus groups and analyze data. But non-researchers know things that we don’t. Radical collaboration involves acknowledging that, while we have some important specialized skills, we don’t hold all knowledge (in broad strokes, it means collaborating in a solutions-focused, action-oriented way rather than a problems-focused one). In fact, this is the case even when we think we have all the available information.


Particularly when working with a program, we know that program staff have amazing insights into the research process. We do our best work when we find out why and how program staff interact with clients, especially around collecting information. They are the ones who can tell us whether our questions make sense, whether we are asking the right questions, or why no one is filling out that one field on the one form. They often know the best way to get information to us, and they also know what information they need from us.


They’re also excellent at helping interpret our data analysis. For example, we presented school-based health staff with a chart showing when students went to the health center. There was a huge spike in September. We all thought that was because students hadn’t been getting needed health care during the summer, so when they came back to school they went to the school nurse to get their health needs taken care of.


Fortunately, we kept our mouths shut and asked the staff what they thought the data pattern meant. They knew immediately. And we were so very off-base with our assumption (again … good thing we kept our mouths shut). “Oh, there is always a new school nurse in the fall, and all the kids ask to go to the nurse to see if she will let them out of class. She sends them right back to class, and by the end of September they stop trying to use that trick.”




But program staff are just the beginning. Working with survey participants can also be an opportunity for radical collaboration. Recently, we were working on a community survey, and we used cognitive interviewing to help us craft the questions. We had the potential respondents think out loud while they answered our initial questions, telling us what they thought we were asking them. This process highlighted areas where we clearly thought a question meant one thing, but these respondents interpreted the question totally differently. By simply listening to people, we were able to craft a survey that was much more valid BEFORE we sent it out to hundreds of people. A little collaboration saved us a lot of headaches.


Foundations also hold unique perspectives. Because they often have the resources and the staff to really dig deep into difficult social problems, they have insights into long-term and national trends, and the many intersecting factors that impact a particular issue. They may know the literature, the players, and all the sites nationwide who are working on the same problem, and what has worked for them and what has not. These insights can help fuel program development, implementation, design, interpretation of research and more. This deep knowledge can help smaller programs learn from larger efforts and avoid reinventing the wheel.


Radical collaboration means recognizing that everyone has the answers. It also means that everyone occupies a unique position and sees things based on where they stand. We may not always be wholly accurate, but everyone can offer a piece of the puzzle. Radical collaboration requires openness: a willingness to entertain new ideas and to try out new strategies, even though they may not work out. But together, the perspectives of all who are involved can create a more accurate picture of what the problem is and more innovative ideas about how to address it.


Check back soon for Part 2 in this series!

Tips & Tricks for Child Focus Groups, Part 2


by Mandi Singleton




(Note: this post is the second part of a two-part series.)


As I mentioned in my last blog post, one of my favorite things about my job at CRC is conducting focus groups. Focus groups with elementary school students can be the most challenging and the most fun for me as a facilitator. Here in part two of my discussion of tips & tricks for doing focus groups with kids, I get into strategies that make for effective and enjoyable groups.


5. Make it fun with hands-on activities! Studies show that incorporating hands-on activities in focus groups with school-aged children increases participation and stimulates discussion. In focus groups I’ve conducted, I led children in several hands-on activities as part of data collection. During one activity, children were given four paddles with faces on them (very happy, happy, sad, and angry) and instructed to hold up the paddle that reflected how they felt in response to statements read aloud.


Exhibit A. The Paddles


Another activity I’ve done with kids involves having them respond to statements by placing stickers on posters that incorporate the same four expressive faces as the paddles.


Exhibit B. The Posters.


Other non-verbal forms of response are effective for use with kids, and multiple types of queries can be used together in one group. For example, along with using the posters in one group I also asked the children to complete a drawing activity in which I instructed them to draw their favorite and least favorite things about their afterschool program.


Exhibit C. Time to Draw!


Exhibit D. A Favorite Thing.


Exhibit E. A Least Favorite Thing.


Implementing all of the above hands-on activities has been successful for me, appearing to boost kids’ engagement and stimulate discussion. I’ve noticed that the sticker-poster activity has been more conducive than the paddles to eliciting honest responses (perhaps because it can be too much fun to wave different faces!). And the drawing activity has stimulated discussions that I believe would never have happened if children were only asked for verbal responses.


6. Watch the clock. Response quality declines in child focus group sessions that run too long. To avoid participant fatigue and promote thoughtful responses, research suggests that focus groups involving school-aged children run no longer than 45 minutes and include breaks for refreshments.


The groups I’ve facilitated have averaged 35-45 minutes and, although there were no breaks included, children remained attentive and actively engaged throughout the sessions. I attribute their attentiveness and active engagement to the short duration of the focus groups, along with the hands-on activities I included. Plan carefully for your choice of activity and timing, though, because activities can take longer than you might expect. All in all, short time frames and activities have kept me on my toes as a facilitator but definitely kept the kids happily busy and more open to sharing information, too.


7. Watch for signs of distress! When conducting focus groups with young children, it is extremely important to maintain awareness of group dynamics even as you try to keep things fun and productively moving along. Young children can become easily distressed when discussing sensitive or personal topics.


For example, in one group a student brought up bullying as her least favorite thing about afterschool programming. When this happened, I made efforts to ensure the student was in control of how much she disclosed about the bullying. When I probed for more detail with follow-up questions, I was careful to ask whether “any students in the program had been bullied” rather than whether she herself had been bullied. Framing the follow-up question this way gave the student the option to choose how much she disclosed and enabled her to discuss the issue without it becoming too personal or distressing.


Concluding thoughts


I hope that my experiences and the strategies I’ve described help you consider the key factors that affect child participants’ involvement, engagement, and production of thoughtful responses during focus group sessions. Before conducting these focus groups, I had concerns about engaging very young children. However, contrary to how I imagined the groups would go, the kids I’ve worked with have not been rambunctious or inattentive; they were enthusiastic and sometimes less focused, yes, but they were still active participants who were able to reflect on and effectively communicate their personal experiences. I’ve enjoyed seeing how excited children are to give me their opinions on issues.


Have you ever conducted focus groups with young children? Do you have any funny stories or suggestions? Please leave a comment and share your experiences with us!



Further reading:


1. Hearing children’s voices: Methodological issues in conducting focus groups with children aged 7-11 years (Morgan, Gibbs, Maxwell, & Britten, Qualitative Research, 2002)

2. Interviews and focus groups with children: Methods that match children’s developing competencies (Gibson, 2012)

3. Focus on qualitative methods: Interviewing children (Docherty & Sandelowski, 1999)

Tips & Tricks for Child Focus Groups, Part 1


by Mandi Singleton


(Note: this post is the first part of a two-part series.)


First tip — don’t expect things to be this orderly!

One of my favorite things about my job is conducting focus groups. I enjoy the opportunity it gives me to interact with people, capturing and learning from their thoughts and feelings about experiences they’ve had. While at CRC I’ve had the opportunity to facilitate a series of focus groups with elementary school students.


Although many of my projects are education-related, I had never done a group with children so young before. The focus groups I’d done in the past involved middle grade students, parents, and school staff, so the thought of conducting focus groups with elementary school students made me a little nervous.


I could just imagine rambunctious 6-to-10-year-olds, hopped up on sugar and far too excited about breaking away from their schools’ typical routines and reins of control to participate in a focus group. I guess my main concerns in conducting focus groups with such young children were getting them involved, keeping them engaged, and capturing genuine but thoughtful responses.


Because school-aged children are still developing (physically, socially, emotionally, cognitively), the way they think, communicate, and interact with others differs from the way adults do. These developmental differences point to the importance of identifying focus group strategies specifically catered to children’s communication competencies, as techniques used in focus groups with adults would not be effective. My overall goals for focus groups with young children are to ensure that the participants understand my questions, have the opportunity to reflect on their own experiences, and as a result can effectively communicate their thoughts and feelings.


Thankfully, I’ve found that with the right strategies, my young focus group participants’ excitement eventually gave way to attentiveness as the group format played to their inquisitive natures.


Here are some of the tips and tricks I’ve found to work for focus groups with children:


1. Be mindful of group composition. To increase involvement, levels of engagement, and quality of responses, research suggests limiting groups to four to six participants who are no more than two years apart in age or level of development. In my experience, I’ve been able to limit each session to six children; for example, one group was conducted with 1st and 2nd graders, while another only included 3rd and 4th graders. I found that limiting participants was beneficial in fostering engagement, while controlling for large age discrepancies seemed to help prevent students’ responses from being overly influenced by their peers.


2. Build a trusting atmosphere and relaxed setting. Children are more likely to be engaged by focus groups that foster relaxed settings where they feel comfortable enough to express their thoughts and feelings. To facilitate this type of setting, research suggests that moderators use ice-breaker games, engage in casual (but age-appropriate) conversation with participants before the start of the session, portray a friendly and relaxed manner, and encourage the use of first names. In my focus groups with children, participants were invited to do an ice-breaker activity at the beginning of the session, which built trust between participants and helped them relax. The students paired up to learn something about each other, then took turns introducing their partners to the rest of the group. The result was a relatively quiet group of children, more comfortable with each other, who became more talkative in an appropriate way as the session progressed. Fostering the right atmosphere when doing groups in schools is especially important; I’ve found it most effective for children to view me, as the moderator, more informally than they view their teachers, which encourages honest responses.


3. Establish ground rules. Research suggests establishing ground rules at the start of each focus group, as they help children understand their role in the group, what is expected from them, and what they can expect from the moderator. At the beginning of each session, I’ve asked participants to abide by basic discussion rules (e.g., be respectful, be good listeners) and told them why I wanted to talk with them. I let them know that nothing they said in the group would be shared with anyone else with their names attached, and that they didn’t have to respond to any questions they didn’t want to. Before starting the focus group, children were also given the opportunity to ask any questions they had. One challenge I’ve encountered in soliciting honest responses arises when children observe their peers and want to model or conform to those peers’ responses. However, I was able to resolve these situations by varying my methods (more about this in Part 2).


4. Consider your interview structure and question formation. Research on focus groups suggests that groups with school-aged children should start off with simple questions that can be answered with brief one-word responses (e.g., yes or no) and progress to more complex or multipart questions. This eases children into the interview process, making them more comfortable with responding to the moderator. The full focus group guide should consist primarily of open-ended questions, with direct questions used only to clarify or elicit more detail on a response. Close attention should be paid to the wording of questions to ensure age appropriateness and that students understand what they are being asked. In groups I’ve facilitated, children were read statements that probed for feelings about their social lives and interests in math, reading, and science. The focus groups started with a few warm-up questions about their feelings toward vanilla ice cream and rainy days; not only were these questions helpful in getting the children comfortable with the interview process, but they also reassured me as the moderator that the children understood how to respond using the tools I provided for non-verbal responses that augment verbal ones (more on this, also, in Part 2). Responses elicited during a drawing activity, for example, were followed up with more direct questions in an effort to stimulate additional discussion and gain further insight.


I hope that the above tips give you some food for thought and a starting point for your data collection with this unique population. Stay tuned for Part 2, including how to engage your groups with FUN activities, coming later this month!

Visual Reports


By: Sheila

Several weeks ago, one of our clients came to us with a challenge: find compelling ways to present 10 years of grantmaking data. The client wanted us to tell their story and present the data in a way that people at all levels (data nerds and non-data nerds) at their organization could easily understand.

I was tasked with analyzing the data and worked closely with the CRC dataviz experts, Taj and Matthew, to come up with the different visuals for the report. I’m no dataviz expert, but here’s what I learned:
  • Client feedback is important: Take time to hear your client’s thoughts on the visualizations you are creating; you want to make sure you are meeting your client’s expectations.
  • Patience is key: I spent a lot of time creating and re-creating multiple charts and graphs. It takes time to make sure every visual aligns with the story you are trying to tell. If you need a break, eat a muffin.
  • Don’t be afraid to ask for help: If you get stuck, ask a colleague or check out online forums to see if there is a solution to the problem you’re having. Google is a great friend.
  • Sketch!: Take time to sketch out what you want your visuals to look like. Trust me, it’ll save you a lot of time in the end.
  • Don’t be afraid to try: I made about 40-50 visuals for this project. Around half were rejected by the design team, but I learned a lot throughout the process:
    • Pie charts are not your friend
    • No one at CRC likes pink or mustard yellow
    • Embrace white space
    • Not every visualization needs to be a bar graph
    • Embrace awesomeness

In the end, we created a pretty cool report for our client that they really liked. Since we can’t share the report online for proprietary reasons, we created a similar report for this blog. Take a look and share your thoughts.


Also, don’t forget to check out our summer webinar series:

Data Systems: Where and How to Store Your Data

July 15, 2015: 12-1pm

Sarah will explore different options for data storage systems and solutions for those who seek to streamline their data collection and storage processes. This learning session will focus on selecting the best software for your specific needs and organizing your program data.

Microsoft Excel Magic: Developing Mesmerizing Charts that Enchant & Engage your Audience

July 29, 2015: 12-1pm

This webinar will focus on the effective visual representation of data using charts that can be created in Microsoft Excel. The strategic use of design principles and the practical skills needed to create charts that are visually appealing, functional, and informative will be addressed.

Each webinar costs $25. Sign up here:

And this great map of outdoor film locations in Baltimore created by Matthew:

Happy Summer!


What Breastfeeding in the U.S. Looks Like

by CRC

CRC’s dataviz team recently completed a comprehensive and beautiful infographic documenting breastfeeding statistics in the United States. Our hope is that this infographic can play a part in spreading the word about this important issue.




From a public health standpoint, the medical benefits of breastfeeding are well established.* Breast milk provides babies with all the fats, proteins, and vitamins they need for healthy growth and development. Among other benefits, antibodies in breast milk can help babies fight infections and reduce the risk of developing asthma and allergies. Moms who breastfeed experience benefits too. Breastfeeding can help mom lose her pregnancy weight (through the calories it burns) and can protect her against breast and ovarian cancers. In addition to the physical benefits, time spent breastfeeding also helps nurture the bond between mom and baby.


See the full infographic here:!breastfeeding-in-the-usa/ccrt


* Although the above benefits of breastfeeding are established in current research, we would like to acknowledge that other bodies of research demonstrate that babies who are not breastfed also can have healthy outcomes and bond well with moms. Many moms cannot or may not choose to breastfeed for a wide variety of reasons, and we support all moms in their choices.

We LOVE Maps: Map out your summer!

by CRC

As we’ve said in the past, we at CRC LOVE MAPS. They’re useful and (often) beautiful, helping us to make all kinds of decisions in research and daily life.


Some of you have been following our interest in maps at the Baltimore DataMind blog, but to make sure more of our readers get to see that content, starting with this post we’re “folding” the BDM blog in here. So now, along with the evaluation news, data tricks, and dataviz tips you’ve come to expect from CRC’s blog, expect to learn more about making and using maps. We also hope to show you a lot of just plain cool ones, starting with this one, created by our own Matthew Earls.


Baltimoreans’ love of festivals is possibly even greater than our love of maps! This map uses an interactive and chronological format to chart all the festivals in town this summer. Use it to map out your summer plans!



A Word About Baltimore City’s Snappy Budget Graphic…


By: Taj
Infographics are all the rage. They are beautiful, engaging, and fun to look at. This one is no exception:
Available for download here:

At first glance, it looks like a lot of fun. The Finance Office has done something unexpected: trying to make the budget of Baltimore City a bit easier to understand by using data visualization. It is likely to be successful, in that more people will look at this than would read a website that breaks out funding by category or talks about the property tax rate; but it also leaves a lot to be desired.

Here are a few things we would suggest to the City of Baltimore for next time:

Pie charts
Be careful with them. Unlike some other data visualization folks, I’m not completely opposed to the pie chart in all situations. But if you are going to use one, you should order the slices from the largest (property taxes at 32%) to the smallest (other at 4%). This one does work okay because you don’t have a million slices.
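For instance, here is a minimal matplotlib sketch of that ordering advice; the 32% and 4% figures come from the City’s chart, but the categories in between are placeholders I made up:

```python
import matplotlib.pyplot as plt

# Revenue shares: the largest and smallest match the City's chart;
# the middle categories are invented placeholders.
shares = {"Property taxes": 32, "State aid": 25, "Income taxes": 14,
          "Charges & fees": 13, "Federal grants": 12, "Other": 4}

# Order slices from largest to smallest before plotting.
labels, sizes = zip(*sorted(shares.items(), key=lambda kv: kv[1], reverse=True))

plt.pie(sizes, labels=labels, autopct="%d%%",
        startangle=90, counterclock=False)  # clockwise from 12 o'clock
plt.axis("equal")                           # keep the pie circular
plt.show()
```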

The colors seem randomly chosen (except for the green tree and blue water) and the color palette on the top half of the infographic looks unrelated to the bottom half. Pick a few colors and stick with them throughout so the piece looks consistent. Consider gradient colors (i.e., a color fill that gradually blends from one color to another) for the Priority Outcomes.

The bottom graphic is hard to read. It might be that the color of each Priority Outcome is supposed to match the color of a pie slice, but it’s not clear. Are the colors communicating information, or is this just the color palette? If the former, it was a good choice for unifying the piece, at least the bottom half. However, a bit of labeling on the bottom pie chart (like the top one) would have cleared this up.

And… What’s with the arrows? What are they pointing at?

We applaud the Finance Department for using data visualization to make the budget more accessible to citizens. And we love the clean look and feel of this piece. We’d love to see them use some of the best practices in the field of data visualization to make it even more impactful!

As a final note, thank you for NOT putting the City seal on this.

Why the Hell was Taj at a Design Conference and What Did She Learn There?


By: Taj
Last week I was at the Interaction Design conference. Now, you probably know that evaluation and design don’t exactly go hand-in-hand, so I understand if your next thought is a befuddled, “Huh?” You don’t usually find evaluators at a design conference.

So, was I just hopping a plane to San Francisco in February because of the awesome weather and the cool town? Well, not exactly, although I have to say those were both nice perks of being there. Did I travel to San Fran for some much-needed R&R and to clear my head? Not entirely, but I do always feel inspired and innovative when I’m there. (And it’s a bit surreal to go by the Uber headquarters in your Uber.)

The truth is, I’ve been doing a ton of reading and thinking about design lately. The more I learn, the more I find that there is common ground between the way we do our work here at CRC and the work of design firms. I’ll be writing a series of blog posts in the future about how we use design thinking in our approach to research and evaluation, but for now I’ll just quote Phi Hong Ha, who said in her conference talk, “Design is about helping people to make sense of the world.”

That’s what we evaluators do, too! Help make sense of the world. It’s our job to help people understand what is going on, what their clients are thinking and feeling, what the reality is of the communities they work in, and what the impact is of their programs.

Design thinking, combined with CRC’s love of beautiful data, makes a design conference the perfect place for me to learn new things, embrace old ways of doing things with a new language, and continue to be inspired. As Tim Brown (Tim Brown!) said, “Information is the material we are most using in design today.”

If that doesn’t sound relevant to what we do here at CRC, I don’t know what does.

So, what did I get out of this conference?

Creative innovation is risky. Smaller firms and individuals are often able to innovate, try new things, and “play” more than larger firms, because we smaller firms have a higher tolerance for risk and less bureaucracy to fight. One barrier to creativity, as Dan Saffer pointed out, is that “efficiency is the enemy of creativity.” Creativity takes time. And it’s not always linear. Researchers love linear—linear relationships, linear regression. Here at CRC we have a certain tolerance for a bit of meandering if it means a better process, a better relationship, and a better outcome.

Things never turn out like you think they will. So stop expecting them to. It’s been our experience that projects never go exactly as intended. That’s because our work involves people, not widgets (or pharmaceuticals). And people change, make decisions, and adapt constantly. Jan Chipchase said, “Once you begin, assume that everything you planned is not relevant.” While this is an extreme perspective, I have found it very helpful to let go of the idea that nothing will change. Being flexible in our thinking has worked very well for us, and it’s a constant in the design world.

Silicon Valley is full of people telling you failure is great. “Fail fast,” they say. Foundations especially struggle with this. They often hold onto approaches or initiatives long after they should let go and move on to new and innovative ideas. Everyone struggles with change, individuals and organizations alike. But Saffer made a good point. “Failure sucks!” he said. Learning from failure is where it’s at.

We are constantly trying out new approaches and new tools. Some of them succeed, some of them fail. But I have yet to try out something new that we didn’t learn from, whether it was software, a process, or a research method.

Another theme of design work is empathy. The idea of empathy often makes researchers uncomfortable. We embrace empathy here at CRC. Watching people work with their clients, fill out forms, or struggle with databases gives us empathy for how we might develop solutions to make their work easier. Indi Young did a great job of defining empathy in a way that I think even evaluators and researchers can embrace. She differentiated between “emotional empathy” (feeling what someone is feeling) and “cognitive empathy,” which is about understanding how people are thinking and why and how they react.

I would argue that we need some of both, because we also need to understand the emotional state people are in when we are asking them to fill out forms, take attendance, complete surveys, etc. Sometimes they are upset, frustrated, or anxious (forms can bring it out in the best of us). But cognitive empathy applies too. How do people see and understand our data collection tools? How are they interpreting the questions we ask on surveys (hello, cognitive interviewing)? What makes them skip over some parts of the forms they fill out? Observing and talking with people about these things specifically can be very fruitful.

Danielle Malik wins the award for best presentation title with “Go Home Data, You’re Drunk.” She talked about how future trends will be all about analytics and customization (even more so than now). Because of the visual storytelling we do here, I was excited to hear that “data points are the words, and it’s up to us to construct the sentences.” Her presentation resonated with me, as she focused on the fact that data is not neutral. You have to constantly think critically about where it comes from, why you collect it, and how you intend to use it. And we’ll definitely be looking for ways to use the hashtag from her presentation … #drunkdata.

I think evaluators have much to learn from the design community. Design is user centered, process oriented, and collaborative. The design process requires that you empathize with your users, understand what their problems are, come up with creative solutions, test them, and then build in a process of iteration and tweaking until you get where you need to be.

I think evaluation should be more like design. I will admit that I often have doubts about the way we design and implement evaluations in this field. Evaluators often do not take the users into account. Some evaluators work in virtual isolation from their end users. How many actually spend time in the schools, public health clinics, and program sites where their data collection takes place (besides us, of course)? How many actually talk to the people filling out the surveys to find out what they think was meant? Karl Fass, professor of user interface design, said, “We shouldn’t be using the vocabulary of natural science,” and I agree. I often question why we use the vocabulary AND the methods of the natural sciences in a context that is very human-centered and (let’s face it) often chaotic.

So I leave you with these questions…

What would happen if we used human-centered design principles and user testing to develop and evaluate social programs instead of evidence-based practices? [1]

What if we began every program implementation with empathy, planned it collaboratively with the end user in mind, and assumed that some iteration would take place before it got to where it needed to be?

What if we conducted our evaluations not like science experiments but like user analytics?

What if we were careful to collect only what we (or our clients) needed, to constantly review the data and collaboratively make decisions with it, and not to assume there was a beginning or an end point? How revolutionary would that be?

Well, if I’m going to be truthful, we know that good evaluation practices do these things, even though we don’t always like to talk about it. But in my upcoming posts on design thinking, I plan to do just that.

[1] It’s a radical notion, but it’s one worth considering. Large, elaborate studies are conducted using the principles of the natural sciences. Then we plop those programs into a variety of locations as if context doesn’t matter. Designers know that context always matters. And that users can be almost infinitely segmented.

Secrets from the Data Cave: January 2015


By: Sarah


Welcome to CRC’s monthly series of articles on all things techie: Secrets from the Data Cave! (For those who don’t know, the title references our room — fondly referred to as “the bat cave” — where data staff can geek out in an isolated setting.) Here we’ll be offering you a sneak peek into the cave, with tips and the latest updates on what we’re implementing here at CRC.

(This month’s SDC is an on-the-road edition!)




Late last fall, I had the opportunity to go to phpworld 2014, a conference for PHP code developers, held in Washington DC.


Many different web applications use the programming language PHP, so coders have a myriad of options for deploying it (including, but not limited to, Drupal, WordPress, Joomla, and Magento). This conference is designed to bring these communities together so developers can learn how others are using the language and get ideas for enhancing existing applications.


The conference sessions that fascinated me the most were those about web security. PHP is great for developers because it is a very powerful language, meaning you can do a lot with it. But, without the proper precautions, hackers can exploit that power over your applications and cause huge headaches.




One conference presenter gave a great example of how this might look in the real world. As context for the example, you need to know that there are certain strings of code that, should a hacker type them into your application when you haven’t protected your database, the application will interpret as legitimate SQL and use to “drop” (delete) tables in your database. A hacker could therefore use this technique (called “SQL injection”) with a “drop table” command – feeding the application input that masquerades as part of the original developer’s code – to irreversibly erase all of your data.



With this in mind, consider the presenter’s example: A would-be hacker in Europe rigged up a fake license plate with a bit of SQL injection code on it. He was exploiting the fact that traffic cameras use picture-parsing technology to break down an image of a license plate into individual characters (to record the plate info of speeders).




So, this was presumably an effort to trick the traffic camera into parsing the code, feeding the data in as it would any license plate. This would ultimately have caused all the stored traffic data on the back-end to be wiped clean via SQL injection! Now, it’s unlikely that this actually worked in practice. But it did get me thinking about the lengths hackers will go to in order to mess with your data.
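To show the underlying idea, here is a minimal Python/sqlite3 sketch (the same principle applies in PHP, for example with PDO prepared statements): string-splicing lets input masquerade as SQL, while a parameterized query binds it as pure data. The hostile “plate” value is invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plates (plate TEXT)")

# A hostile "license plate" carrying an injection payload.
plate = "XX123'); DROP TABLE plates; --"

# VULNERABLE: splicing input into the SQL string turns data into code.
unsafe = f"INSERT INTO plates VALUES ('{plate}')"
print(unsafe)  # the payload breaks out of the quotes and appends DROP TABLE

# SAFE: a parameterized query treats the input strictly as a value.
conn.execute("INSERT INTO plates VALUES (?)", (plate,))
print(conn.execute("SELECT plate FROM plates").fetchall())
```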


All in all, it was an excellent conference, and I feel I learned a lot about this powerful coding language. I’m now looking forward to connecting with more PHP developers in the future!