Visualizing Program Capacity: Design with your client in mind

Written by CRC. Posted in Customer Satisfaction Research, Dataviz, Design

by Dana & Jill


Over the past few blog entries, Taj has shared lessons learned about design thinking that you can apply to your work (here and here). Taj will be continuing that series soon, but in the meantime, we wanted to share a related example of creating a simple but effective visualization for a client.




A local agency wanted to track client capacity on a monthly basis. This agency oversees services for pregnant women across multiple program locations, so tracking this information is necessary not only for its oversight of services, but also for sound management of the dollars received from its funder. Each service program under our client’s purview has an individualized contracted site capacity (i.e., the maximum number of clients it can serve) and deals with an influx of clients enrolling in and withdrawing from services each month. The agency needed something that could summarize, at a glance, all the information they wanted to see. Because CRC was already the evaluator on the project, the agency reached out to us to develop a visual tool to improve their tracking and reporting. CRC staff were familiar with the critical elements of the work (i.e., contracted site capacity, monthly site capacity, and the number of clients enrolled/discharged each month) that would need to be incorporated into the visualization.


We wanted to create a visualization that would allow our client to view accurate program capacity each month, observe trends over time, and have a management tool for identifying the factors behind any substantial enrollment changes.


Process of developing the visualization


Once the necessary data were gathered, CRC staff began brainstorming. We first used the charting tools available in Excel, such as line and bar charts, but quickly realized that only two or three data elements could be easily displayed in such charts. We then began rough-sketching ideas for how all the data could be displayed within one visual. Because we wanted to communicate enrollment relative to capacity at a glance, our sketching led us to the infamous “bubble.” We’re well aware of the critiques that have been leveled against such charts, but ultimately went with one here because it allowed us to visualize a data set with up to four dimensions in a single, easy-to-interpret chart. After our rough sketches were drafted on paper, we used basic graphic design software to recreate them electronically. The sketches were shared for internal review and edited before a draft was shared with the client. Once the client was satisfied with the layout and the information presented, the tool was recreated for each of the five programs within the agency. A separate version summarizing capacity data for the agency overall was also developed and provided to the client.
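To make those four dimensions concrete, here is a minimal sketch of the kind of calculation that sits behind such a chart, where month, current enrollment, contracted capacity, and utilization each map to one visual encoding (x position, y position, bubble size, and color). The function name and all figures below are hypothetical examples, not the agency’s actual data.

```python
# Sketch: roll monthly enrollments/discharges into the four values a
# capacity "bubble" can encode. All names and numbers are hypothetical.

def monthly_summary(contracted_capacity, start_enrolled, enrollments, discharges):
    """Return one row per month with the four dimensions to visualize."""
    summary = []
    enrolled = start_enrolled
    for month, (enrolled_in, discharged) in enumerate(zip(enrollments, discharges), start=1):
        enrolled = enrolled + enrolled_in - discharged
        summary.append({
            "month": month,                                  # x position
            "enrolled": enrolled,                            # y position
            "capacity": contracted_capacity,                 # bubble size
            "utilization": enrolled / contracted_capacity,   # bubble color
        })
    return summary

site = monthly_summary(contracted_capacity=40, start_enrolled=30,
                       enrollments=[6, 4, 9], discharges=[2, 5, 3])
for row in site:
    print(row["month"], row["enrolled"], row["capacity"])
```

From here, each row can be handed to whatever charting tool you prefer; the point is that a single bubble carries all four values at once.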


Final Product


The resulting data visualization tool was updated each month by CRC staff. Monthly data for each program were provided to CRC by the client, and our staff then updated the visualizations. The final product tells an ongoing “story” of capacity for each of the agency’s programs and for the agency as a comprehensive whole. (See below for an example of one of the individual program visualizations.)




Although this is not the most complex visualization that CRC has created, it provides a good example of the importance of working collaboratively and thinking just enough outside of the box to provide clients with what they actually need at a given time. And often that’s what it’s all about!

Design & Evaluation: Focus on Human Values

Written by CRC

by Taj



This is the second post in a series about design thinking in evaluation. The goal of this series is to share insights from the world of design that may help you think differently about how you work and, hopefully, start a conversation about what the world of social sciences can learn from the world of design. If you missed Part 1 about radical collaboration, check it out here. This time around we’re focusing on another key idea in the design thinking world: human values. As evaluators, we may deal largely in numbers and spend a lot of time in front of spreadsheets. Yet we can’t forget the real reason we do what we do – helping people. And to help people, we have to understand where they’re coming from. The Institute of Design at Stanford describes it this way: “Empathy for the people you are designing for and feedback from these users is fundamental to good design.”


What does empathy mean for designers in all fields, not just evaluation? Well, designers have certainly been known to create things that are beautiful but a little bit esoteric. The white couch you can’t sit on without wrinkling its perfect linen upholstery. The beautiful teapot that doesn’t have a handle. The elegant solution to a public health issue that ignores the values of the people in the community. We work with people who are passionate about people. People helping people. People who are embroiled in intense emotional and even life-threatening situations. But when we design evaluations, we sometimes forget to think about those people, their values, and what they are experiencing.


A focus on human values should be obvious, but many times it’s overshadowed by concerns about rigor, replication, and publication. We feel that if we customize our research too much, we won’t be able to generalize and share it. But if we don’t, we risk creating evaluation tools and designs that don’t work for our clients or, even worse, that ignore their everyday realities altogether.


I once encountered an evaluator who was designing an outcome survey for a group of adolescent boys involved with the juvenile justice system. These adolescents participated in a male mentoring program. The program was fairly limited, with only eight two-hour sessions, and many of the boys struggled greatly in school, especially with reading. The evaluator developed a lovely survey that was totally reliable and valid. The problem was that it was eight pages long. If this evaluator had thought about the actual people who would be filling the survey out, she would have realized that it was bound to provoke test anxiety and fear of failure among the youth. Not only that, but from a program administrator’s point of view, the survey would take up far too much of the short, valuable time they had to spend with the boys. Methodological considerations aside (and there were many), the approach was far from human-centered.


In another instance, we worked with an out-of-school-time program that was struggling to track daily attendance for its students. At first, we were very frustrated and couldn’t understand why they were having so much trouble checking off names on a list. So we went to the school and observed the beginning of the program and the check-in process. Pretty quickly we let go of OUR frustration and understood THEIR frustration. They had a list with names, but kids were running in and out of the program, the way kids do after school before they get settled into their next activity. A teacher would check a name off, but then the kid would run out and back in, making it hard for the teacher to be sure whether they had checked off that kid or accidentally checked off a different one. With 100 children in the room, it was hard to keep track. So we created a simple system for them that involved cards and hand-held scanners. The teachers scanned the students’ cards when they came in, and the attendance was recorded in an Excel spreadsheet. If a kid wasn’t enrolled, there was no card, and if a kid had already been recorded, the system wouldn’t record them again. Overall, it saved tons of time and energy and … frustration.
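For the curious, the check-in logic described above can be sketched in a few lines. This is a hypothetical re-creation of the idea (the function name and IDs are invented for illustration), not the actual scanner software:

```python
# Sketch of the card-scan check-in rules: unknown cards are ignored,
# and a card already scanned today is not recorded twice.

def check_in(scan_id, enrolled_ids, todays_attendance):
    """Record a scanned card at most once per day."""
    if scan_id not in enrolled_ids:
        return "not enrolled"          # no card on file, nothing recorded
    if scan_id in todays_attendance:
        return "already recorded"      # kid ran out and back in, no duplicate
    todays_attendance.add(scan_id)
    return "recorded"

enrolled = {"S001", "S002", "S003"}
today = set()
print(check_in("S001", enrolled, today))  # recorded
print(check_in("S001", enrolled, today))  # already recorded
print(check_in("S999", enrolled, today))  # not enrolled
```

Using a set for the day’s attendance is what makes duplicates a non-issue: membership checks are cheap, and adding the same ID twice has no effect.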


In creating solutions for both of the above situations, we also incorporated feedback from users. We listened (probably the most important skill we have as consultants) to the concerns of our clients, talked to them about how the boys were reacting to the surveys, and observed the classrooms where the attendance system was used. There is always room to ask clients how the tools we are using are working for them, to find out how things look in the real world, and, whenever possible, we love to observe our work in action. It often looks different from what we expected.


Our scientific selves and our training are often at odds with our relationships with clients. It’s important to provide clients with the best data possible, so they can make the best decisions possible and use that data to tell their stories. It’s just as important to remember that real people with real constraints, frustrations, and often chaos all around them are the ones who will be filling out our forms and using our information. 


Really, we do better work when we think about the people we’re doing the work for.


Check back soon for Part 3 in this series!

Numbers Count: How to Use Data for More Effective Fundraising

Written by Taj Carson. Posted in Design

by Taj.

Let’s face it, fundraising can be one of the most dreaded aspects of running a nonprofit. A lot of people feel unprepared and apprehensive about it; asking for money is hard. But there are ways to make it a little easier, and more effective.

That’s where data come in.

Perhaps you think of data and fundraising as natural complements to each other. Perhaps you never considered using data in your fundraising efforts. And perhaps you aren’t even sure what qualifies as data. Well, I’m here to help. In this post I’ll go over how you can use data to support and enhance your fundraising efforts for more successful results.

Why Even Use Data?

You might be wondering, “Isn’t fundraising all about emotional appeal? So why even use data at all?” It’s true that people are motivated to give by compelling stories. But they also have to believe your story. Data, when used appropriately, is incredibly compelling. It can support your story with information, numbers, and facts, which are especially helpful if your audience is skeptical. And data visualization can further add a creative and emotional component to your fundraising campaigns.

But there’s more to using data than just throwing numbers at your audience. You can use it in-house, too, in order to create segmented and targeted fundraising campaigns, to support grant writing, and to grow your membership.

But before I get into all that, let’s clear up one thing. What is data? People often think data has to involve giant databases and lots of numbers. Both of those things are powerful (especially if you’re talking about audience segmentation), but data can also be viewed more broadly. As Jim Collins said in Good to Great and the Social Sectors:

“It doesn’t really matter whether you can quantify your results. What matters is that you rigorously assemble evidence—quantitative or qualitative—to track your progress. If the evidence is primarily qualitative, think like a trial lawyer assembling the combined body of evidence.”

Collins goes on to talk about what to do with quantitative evidence (think like a scientist), but I would argue that in this context you should visualize it and use it to support your qualitative information. Back up those stories with numbers, and they’re instantly more believable.

So data can be quantitative (demographics, number of donors, amounts given, member satisfaction ratings, etc.) or qualitative (personal stories about why people give, narratives about how your organization impacts people or policies, and even before and after images). The important thing is that you collect the right information on a regular basis to fuel your decisions and to show your impact.

This leads to the question: how can you use data to support the fundraising and development efforts of your organization? In general, you can use it to analyze where your funding comes from, which donors or organizations give more and under what circumstances, and how your social media strategy impacts your fundraising efforts. Social media data is a whole blog post on its own, but here are some other ways to use data for your fundraising efforts:

Grant Seeking

Grants are an integral part of funds development for nonprofits. Data can support your grant seeking efforts in several different ways:
  • You can use data in your proposals to support your case. Demonstrate the need for your services with census data, service data, and personal stories from your clients or community members.
  • Once you have the grant, you can use data to demonstrate to the funder that you used their funds as intended (services provided, people served, campaigns launched, partnerships formed, etc.). Donors and funders like to see proof that their money was well spent. They like to know who was served and how, even information about the quality of the services or the efforts.
  • If appropriate, you can also use data to show the impact of your work. What this looks like varies broadly. In some cases, the funder will have very specific requirements as to what they want you to report out on. In other cases, the requirements might be vague, and this gives you some room to be creative. For example, you could exclusively provide them with numbers about your programs and their successes, but it’s also very helpful to provide qualitative data, or stories, about the impact of your work. You can include images of your program, your community changes, and even videos. And you could use narrative information from individuals, group discussions, or town hall meetings to support your claims that the work that was funded had an impact.

Special Events

Many nonprofits have a big signature event every year or even several special events. Data can help you with this in a few ways:
  • You can use financial data to tell you how much was spent putting on the event compared to how much the event raised.
  • Data can tell you what money was raised from big donors, sponsors, or regular attendees. It can help you target attendees and sponsors year after year and look at trends.
  • You can also use data at the event to boost additional donations. Use images of clients and neighborhoods, combined with data about the need you are hoping to address, or data (perhaps some numbers combined with compelling quotes) about how your clients or communities have been helped by the work you do.

Individual Donors
If you work with individual donors, there is a potential gold mine in your donor database. Make sure to keep the database up-to-date, regardless of what software you use, and you can start to see what type of donor gives regularly, what type gives more, and at what time of year they give. Some donors prefer to give monthly, some prefer to attend your big event, and others are great about donating in-kind services. If you can segment your donors using this data, you can target them based on the kind of giving they prefer.
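As a rough illustration of that kind of segmentation, here is a minimal sketch in Python. The field names, categories, and rules below are hypothetical; in practice they would come from your own donor database and giving history:

```python
# Sketch: group donors by preferred giving style so each segment can
# get a targeted appeal. Fields, rules, and IDs are made up for the example.
from collections import defaultdict

def segment_donors(donations):
    """Return a mapping of segment name -> set of donor IDs."""
    segments = defaultdict(set)
    for d in donations:
        if d["monthly"]:
            segments["monthly givers"].add(d["donor_id"])
        elif d["channel"] == "event":
            segments["event attendees"].add(d["donor_id"])
        elif d["channel"] == "in-kind":
            segments["in-kind donors"].add(d["donor_id"])
        else:
            segments["occasional givers"].add(d["donor_id"])
    return dict(segments)

gifts = [
    {"donor_id": "A", "monthly": True,  "channel": "online"},
    {"donor_id": "B", "monthly": False, "channel": "event"},
    {"donor_id": "C", "monthly": False, "channel": "in-kind"},
    {"donor_id": "D", "monthly": False, "channel": "mail"},
]
print(segment_donors(gifts))
```

Even a simple rule-based pass like this is enough to stop sending the big-event invitation to donors who only ever give monthly online, which is the whole point of segmentation.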

You can also use data in your donor campaigns. People often do respond best to emotional appeals, and you can combine compelling emotional storytelling with solid sources of data about your issue, the need for your organization’s work, and the people and communities that will be impacted. This will reach people who are more inclined to donate based on emotional appeals, as well as those who prefer a more logical approach who are concerned with knowing that their money is well spent. Including data rounds out your appeal, making sure you’re reaching as many people as possible.

Memberships
If memberships are an important part of your fundraising strategy, then recruiting and retaining members, as well as keeping them happy, are important contributors to the health of your organization. There is more to this than just surveys:
  • Data can help you identify which membership recruitment strategies are most effective. If you send out mailings or use social media to recruit members, you can track not only which efforts are most effective at increasing membership, but also what kind of members each strategy is likely to pull in.
  • Analysis can help you to segment your members, as you would with your donors, and use different strategies to reach out to them. Older members, long-time members, male vs. female members — there are many ways to think about members and why they are members, especially if you have a lot of them.
  • You can also use data collection to learn more about your members, how they feel about your organization, what they are looking for in a membership, and how you can improve in serving their needs. Here a survey is very helpful, but focus groups and interviews can also give you lots of good information.

So whatever your fundraising strategy is, you could be using data (quantitative AND qualitative) to support your efforts and make them more effective. Fundraising is hard work, so make it as effective and efficient as possible with data on your side.

Want me to write a blog post on a specific topic related to data and fundraising? Let me know!

Design & Evaluation: Radical Collaboration

Written by CRC. Posted in Design

by Taj


(Pardon our silence over these past several months! After our unintentional hiatus, we’re getting back into our blogging routine, sharing evaluation-related news, tips, and tricks on a somewhat monthly basis. Starting with today’s post, the first in a series of posts about design and evaluation…)




Over the past few years, as CRC has explored and embraced visual thinking, information visualization, and the use of technology in evaluation, I’ve gotten a real world education in design, technology, and design thinking. I’ve done a lot of reading, even more experiments, and gotten an actual education in these areas through completing a Master’s in Information Visualization from the Maryland Institute College of Art.


Researchers have plenty to do. Keeping up with the current trends in their field(s), new data collection methods, the latest in propensity scoring, and the changing context of neighborhoods and communities, to say nothing of keeping an eye on changing funding priorities from foundations and government agencies … it’s a lot of work. So I wanted to make this part – incorporating design thinking – a little easier for you. Through this series, I’ll be sharing some insights to help you think differently about how you work, with hopes of starting a conversation about what the world of social sciences can learn from the world of design. (These thoughts are informed by many books, conversations, and conferences, but especially by the work of Don Norman and others in the Human Centered Design Field, that of the Institute of Design at Stanford, and of Tim Brown and his colleagues at IDEO.)  


This first post will focus on a principle of design thinking. Those that follow will talk about how we might adopt design frameworks, and take a look at Norman’s work on Human Centered Design, which examines how to concretely make things that people can and will use, and will actually enjoy using. (And if you’ve ever made a form that people hate, you know we can certainly learn a thing or two from Norman.)


Part 1: Radical Collaboration


There are many, many types of evaluation and research, and some explicitly focus on being collaborative, participatory, and/or empowering. But research is often a top-down endeavor. It may require people like us, researchers and evaluators with advanced degrees and specialized skills. We know things others don’t. We know how to write survey questions, how to do representative sampling, how to conduct focus groups and analyze data. But non-researchers know things that we don’t. Radical collaboration involves acknowledging that, while we have some important specialized skills, we don’t hold all the knowledge (in broad strokes, it means collaborating in a solutions-focused, action-oriented way rather than a problems-focused one). This is the case even when we seem to have all the available information.


Particularly when working with a program, we know that program staff have amazing insights into the research process. We do our best work when we find out why and how program staff interact with clients, especially around collecting information. They are the ones who can tell us whether our questions make sense, whether we are asking the right questions, or why no one is filling out that one field on the one form. They often know the best way to get information to us, and they also know what information they need from us.


They’re also excellent at helping interpret our data analysis. For example, we presented school-based health staff with a chart showing when students went to the health center. There was a huge spike in September. We all thought that was because students hadn’t been getting needed health care during the summer, so when they came back to school they went to the school nurse to get their health needs taken care of.


Fortunately, we kept our mouths shut and asked the staff what they thought the data pattern meant. They knew immediately. And we were so very off-base with our assumption (again … good thing we kept our mouths shut). “Oh, there is always a new school nurse in the fall, and all the kids ask to go to the nurse to see if she will let them out of class. She sends them right back to class, and by the end of September they stop trying to use that trick.”




But program staff are just the beginning. Working with survey participants can also be an opportunity for radical collaboration. Recently, we were working on a community survey, and we used cognitive interviewing to help us craft the questions. We had the potential respondents think out loud while they answered our initial questions, telling us what they thought we were asking them. This process highlighted areas where we clearly thought a question meant one thing, but these respondents interpreted the question totally differently. By simply listening to people, we were able to craft a survey that was much more valid BEFORE we sent it out to hundreds of people. A little collaboration saved us a lot of headaches.


Foundations also hold unique perspectives. Because they often have the resources and the staff to really dig deep into difficult social problems, they have insights into long-term and national trends, and the many intersecting factors that impact a particular issue. They may know the literature, the players, and all the sites nationwide who are working on the same problem, and what has worked for them and what has not. These insights can help fuel program development, implementation, design, interpretation of research and more. This deep knowledge can help smaller programs learn from larger efforts and avoid reinventing the wheel.


Radical collaboration means recognizing that everyone has some of the answers. It also means that everyone occupies a unique position and sees things based on where they stand. We may not always be wholly accurate, but everyone can offer a piece of the puzzle. Radical collaboration requires openness: a willingness to try new things, be open to new ideas, and try out new strategies, even though they may not work out. But together, the perspectives of all who are involved can create a more accurate picture of what the problem is and more innovative ideas about how to address it.


Check back soon for Part 2 in this series!

Tips & Tricks for Child Focus Groups, Part 2

Written by CRC

by Mandi Singleton




(Note: this post is the second part of a two-part series.)


As I mentioned in my last blog post, one of my favorite things about my job at CRC is conducting focus groups. Focus groups with elementary school students can be both the most challenging and the most fun for me as a focus group facilitator. Here in part two of my discussion of tips & tricks for doing focus groups with kids, I get into strategies that make for effective and enjoyable groups.


5. Make it fun with hands-on activities! Studies show that incorporating hands-on activities in focus groups with school-aged children increases participation and stimulates discussion. In the focus groups I’ve conducted, I led children in several hands-on activities as part of data collection. During one activity, children were given four paddles with faces on them (very happy, happy, sad, and angry) and instructed to hold up the paddle that reflected how they felt in response to statements read aloud.


Exhibit A. The Paddles



Another activity I’ve done with kids involves them responding to statements by placing stickers on posters, which incorporated the same four expressive faces as the paddles.


Exhibit B. The Posters.



Other non-verbal forms of response are effective for use with kids, and multiple types of queries can be used together in one group. For example, along with using the posters in one group I also asked the children to complete a drawing activity in which I instructed them to draw their favorite and least favorite things about their afterschool program.


Exhibit C. Time to Draw!



Exhibit D. A Favorite Thing.



Exhibit E. A Least Favorite Thing.



Implementing all of the above hands-on activities has been successful for me, boosting kids’ engagement and stimulating discussion. I’ve noticed that the sticker-poster activity has been more conducive than the paddles to eliciting honest responses (perhaps because it can be too much fun to wave different faces!). And the drawing activity has stimulated discussions that I believe would never have happened if children were only asked for verbal responses.


6. Watch the clock. Response quality declines in child focus group sessions lasting longer than 45 minutes. To avoid participant fatigue and promote thoughtful responses, research suggests that focus groups involving school-aged children shouldn’t run any longer than 45 minutes and should include breaks for refreshments.


The groups I’ve facilitated have averaged 35-45 minutes and, although no breaks were included, children remained attentive and actively engaged throughout the sessions. I attribute their attentiveness and active engagement to the short duration of the focus groups, along with the hands-on activities I included. Plan carefully for your choice of activity and timing, though, because activities can take longer than you might expect. All in all, short time frames and activities have kept me on my toes as a facilitator, but they’ve definitely kept the kids happily busy and more open to sharing information, too.


7. Watch for signs of distress! When conducting focus groups with young children, it is extremely important to maintain awareness of group dynamics even as you try to keep things fun and productively moving along. Young children can become easily distressed when discussing sensitive or personal topics.


For example, in my experience, I’ve had one student bring up bullying as her least favorite thing about afterschool programming. When this happened, I made an effort to ensure the student was in control of how much she disclosed about the bullying. When I asked follow-up questions for more detail, I was careful to ask whether “any students in the program had been bullied” rather than whether she had been bullied. Framing the follow-up question this way gave the student the option to choose how much she disclosed and enabled her to discuss the issue without it becoming too personal or distressing.


Concluding thoughts


I hope that my experiences and the strategies I’ve described help you consider the key factors that affect child participants’ involvement, levels of engagement, and production of thoughtful responses during focus group sessions. Before conducting such focus groups, I had concerns about engaging very young children. However, contrary to how I imagined the groups would go, the kids I’ve worked with have not been rambunctious or inattentive; they were enthusiastic and sometimes less focused, yes, but they were still active participants who were able to reflect on and effectively communicate their personal experiences. I’ve enjoyed seeing how excited children are to give me their opinions on issues.


Have you ever conducted focus groups with young children? Do you have any funny stories or suggestions? Please leave a comment and share your experiences with us!



Further reading:


1. Morgan, M., Gibbs, S., Maxwell, K., & Britten, N. (2002). Hearing children’s voices: Methodological issues in conducting focus groups with children aged 7-11 years. Qualitative Research.


2. Gibson (2012). Interviews and focus groups with children: Methods that match children’s developing competencies.


3. Docherty, S., & Sandelowski, M. (1999). Focus on qualitative methods: Interviewing children.

Tips & Tricks for Child Focus Groups, Part 1

Written by CRC

by Mandi Singleton


(Note: this post is the first part of a two-part series.)



First tip — don’t expect things to be this orderly!

One of my favorite things about my job is conducting focus groups. I enjoy the opportunity it gives me to interact with people, capturing and learning from their thoughts and feelings about experiences they’ve had. While at CRC I’ve had the opportunity to facilitate a series of focus groups with elementary school students.


Although many of my projects are education-related, I had never done a group with children so young before. The focus groups I’d done in the past involved middle grade students, parents, and school staff, so the thought of conducting focus groups with elementary school students made me a little nervous.


I could just imagine rambunctious 6- to 10-year-olds, hopped up on sugar and far too excited about breaking away from their schools’ typical routines and reins of control to participate in a focus group. My main concerns in conducting focus groups with such young children were getting them involved, keeping them engaged, and capturing genuine but thoughtful responses.


Because school-aged children are still developing (physically, socially, emotionally, and cognitively), the ways they think, communicate, and interact with others differ from those of adults. These developmental differences point to the importance of identifying focus group strategies specifically catered to children’s communication competencies, as techniques used in focus groups with adults would not be effective. My overall goals for focus groups with young children are to ensure that the participants understand my questions, have the opportunity to reflect on their own experiences, and, as a result, can effectively communicate their thoughts and feelings.


Thankfully, I’ve found that with the right strategies, my young focus group participants’ excitement eventually gave way to attentiveness as the group format played to their inquisitive natures.


Here are some of the tips and tricks I’ve found to work for focus groups with children:


1. Be mindful of group composition. To increase involvement, levels of engagement, and quality of responses, research suggests limiting groups to four to six participants who are no more than two years apart in age or level of development. In my experience, I’ve been able to limit each session to six children; for example, one group was conducted with 1st and 2nd graders, while another included only 3rd and 4th graders. I found that limiting the number of participants was beneficial in fostering engagement, while controlling for large age discrepancies seemed to help prevent students’ responses from being overly influenced by their peers.


2. Build a trusting atmosphere and relaxed setting. Children are more likely to be engaged by focus groups that foster relaxed settings where they feel comfortable enough to express their thoughts and feelings. To facilitate this type of setting, research suggests that moderators use ice-breaker games, engage in casual (but age appropriate) conversation with participants before the start of the session, portray a friendly and relaxed manner, and encourage the use of first names. In my focus groups with children, participants were invited to do an ice-breaker activity at the beginning of the session, which did build trust between participants and helped them to relax. The students paired up with a partner in order to learn something about each other, and took turns introducing their partner to the rest of the group. The result was a relatively quiet group of children, more comfortable with each other, who then became more talkative in an appropriate way as the session progressed. Fostering a certain atmosphere when doing groups in schools is especially important; I’ve found it most effective for children to view me, as the moderator, in a more informal way than they do their teachers to encourage their honest responses.


3. Establish ground rules. Research suggests establishing ground rules at the start of each focus group; they help children understand their role in the group, what is expected from them, and what they can expect from the moderator. At the beginning of each session, I've asked participants to abide by basic discussion rules (e.g., be respectful, be good listeners) and told them why I wanted to talk with them. I let them know that nothing they said in the group would be shared with anyone else with their names attached, and that they didn't have to respond to any questions they didn't want to. Before starting the focus group, children were also given the opportunity to ask any questions they had. Challenges have still arisen for me in soliciting honest responses, usually when children observed their peers' answers and wanted to model or conform to them. However, I was able to resolve these situations by varying my methods (more about this in Part 2).


4. Consider your interview structure and question formation. Research on focus groups suggests that groups with school-aged children should start off with simple questions that can be answered with brief one-word responses (e.g., yes or no) and progress to more complex or multipart questions. This eases children into the interview process, making them more comfortable responding to the moderator. The full focus group guide should consist primarily of open-ended questions, with direct questions used only to clarify or elicit more detail on a response. Close attention should be paid to the wording of questions to ensure they are age appropriate and that students understand what they are being asked. In groups I've facilitated, children were read statements that probed for feelings about their social lives and their interest in math, reading, and science. The focus groups started with a few warm-up questions about their feelings toward vanilla ice cream and rainy days; not only were these questions helpful in getting the children comfortable with the interview process, but they also reassured me as the moderator that the children understood how to respond using the tools I provided to augment verbal responses with non-verbal ones (more on this, also, in Part 2). Responses elicited during a drawing activity, for example, were followed up with more direct questions to stimulate additional discussion and gain further insight.


I hope that the above tips give you some food for thought and a starting point for your data collection with this unique population. Stay tuned for Part 2, including how to engage your groups with FUN activities, coming later this month!

Visual Reports

Written by Taj Carson on . Posted in Dataviz, Technology and Customer Service

By: Sheila

Several weeks ago, one of our clients came to us with a challenge: find compelling ways to present 10 years of grantmaking data. The client wanted us to tell their story and present the data in a way that people at all levels (data nerds and non-data nerds) at their organization could easily understand.

I was tasked with analyzing the data and worked closely with the CRC dataviz experts, Taj and Matthew, to come up with the different visuals for the report. I’m no dataviz expert but here’s what I learned:
  • Client feedback is important: Take time to hear your client's thoughts on the visualizations you are creating; you want to make sure you are meeting their expectations.
  • Patience is key: I spent a lot of time creating and re-creating multiple charts and graphs. It takes time to make sure every visual aligns with the story you are trying to tell. If you need a break, eat a muffin.
  • Don’t be afraid to ask for help: If you get stuck, ask a colleague or check out online forums to see if there is a solution to the problem you’re having. Google is a great friend.
  • Sketch!: Take time to sketch out what you want your visuals to look like. Trust me, it’ll save you a lot of time in the end.
  • Don’t be afraid to try: I made about 40-50 visuals for this project. Around half were rejected by the design team but I learned a lot throughout the process:
    • Pie charts are not your friend
    • No one at CRC likes pink or mustard yellow
    • Embrace white space
    • Not every visualization needs to be a bar graph
    • Embrace awesomeness

In the end, we created a pretty cool report for our client that they really liked. Since we can't share the report online for proprietary reasons, we created a similar report for this blog. Take a look and share your thoughts.


Also, don’t forget to check out our summer webinar series:

Data Systems: Where and How to Store Your Data

July 15, 2015: 12-1pm

Sarah will explore different options for data storage systems and solutions for those who seek to streamline their data collection and storage processes. This learning session will focus on selecting the best software for your specific needs and organizing your program data.

Microsoft Excel Magic: Developing Mesmerizing Charts that Enchant & Engage your Audience

July 29, 2015: 12-1pm

This webinar will focus on the effective visual representation of data using charts that can be created in Microsoft Excel. The strategic use of design principles and the practical skills needed to create charts that are visually appealing, functional, and informative will be addressed.

Each webinar costs $25. Sign up here:

And check out this great map of outdoor film locations in Baltimore, created by Matthew:

Happy Summer!


What Breastfeeding in the U.S. Looks Like

Written by CRC on . Posted in

CRC’s dataviz team recently completed a comprehensive and beautiful infographic documenting breastfeeding statistics in the United States. Our hope is that this infographic can play a part in spreading the word about this important issue.




From a public health standpoint, the medical benefits of breastfeeding are well established.* Breast milk provides babies with all the necessary fats, proteins, and vitamins they need for healthy growth and development. Among other benefits, antibodies in breast milk can help babies fight infections and reduce the risk of developing asthma and allergies. Moms who breastfeed experience benefits too. Breastfeeding can help mom lose her pregnancy weight (through the calories it burns) and can protect her against breast and ovarian cancers. In addition to the physical benefits, time spent breastfeeding also helps nurture the bond between mom and baby.


See the full infographic here:!breastfeeding-in-the-usa/ccrt


* Although the above benefits of breastfeeding are established in current research, we would like to acknowledge that other bodies of research demonstrate that babies who are not breastfed also can have healthy outcomes and bond well with moms. Many moms cannot or may not choose to breastfeed for a wide variety of reasons, and we support all moms in their choices.

We LOVE Maps: Map out your summer!

Written by CRC on . Posted in

As we’ve said in the past, we at CRC LOVE MAPS. They’re useful and (often) beautiful, helping us to make all kinds of decisions in research and daily life.


Some of you have been following our interest in maps at the Baltimore DataMind blog, but to make sure more of our readers get to see that content, starting with this post we’re “folding” the BDM blog in here. So now, along with the evaluation news, data tricks, and dataviz tips you’ve come to expect from CRC’s blog, expect to learn more about making and using maps. We also hope to show you a lot of just plain cool ones, starting with this one, created by our own Matthew Earls.


Baltimoreans' love of festivals is possibly even greater than our love of maps! This map uses an interactive, chronological format to lay out all the festivals in town this summer. Use it to map out your summer plans!



A Word About Baltimore City’s Snappy Budget Graphic….

Written by CRC on . Posted in

By: Taj
Infographics are all the rage. They are beautiful, engaging, and fun to look at. This one is no exception:    
Available for download here:

At first glance, it looks like a lot of fun. The Finance Office has done something unexpected: trying to make Baltimore City's budget a bit easier to understand by using data visualization. It is likely to succeed, in that more people will look at this than would read a website that breaks out funding by category or talks about the property tax rate. But it also leaves a lot to be desired.

Here are a few things we would suggest to the City of Baltimore for next time:

Pie charts
Be careful with them. Unlike some other data visualization folks, I'm not completely opposed to the pie chart in all situations. But if you are going to use one, you should order the slices from the largest (property taxes at 32%) to the smallest (other at 4%). This one works okay because you don't have a million slices.
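If you happen to build your chart in code rather than a design tool, the sort is one line. Here's a minimal sketch in Python with matplotlib; the revenue shares are made up for illustration, not the city's actual figures:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

# Hypothetical revenue shares standing in for the city's categories
revenue = {
    "Property taxes": 32,
    "State aid": 25,
    "Income taxes": 15,
    "Charges & fees": 14,
    "Federal grants": 10,
    "Other": 4,
}

# Order slices from largest to smallest before plotting
ordered = sorted(revenue.items(), key=lambda kv: kv[1], reverse=True)
labels, sizes = zip(*ordered)

# counterclock=False draws slices clockwise from the top,
# so the biggest slice starts at 12 o'clock
plt.pie(sizes, labels=labels, startangle=90, counterclock=False,
        autopct="%d%%")
plt.title("City revenue by source (illustrative)")
plt.savefig("revenue_pie.png")
```

Sorting the data before it hits the chart, rather than eyeballing the result, means the rule holds even when the numbers change next year.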

The colors seem randomly chosen (except for the green tree and blue water) and the color palette on the top half of the infographic looks unrelated to the bottom half. Pick a few colors and stick with them throughout so the piece looks consistent. Consider gradient colors (i.e., a color fill that gradually blends from one color to another) for the Priority Outcomes.
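For readers working in code, here's one way to generate such a gradient: sample evenly spaced shades from a single sequential colormap. This is just a sketch; the "Blues" colormap and the five-outcome count are placeholders, not choices the city made:

```python
import matplotlib
from matplotlib import colormaps
from matplotlib.colors import to_hex

# Give each Priority Outcome a shade from one sequential colormap,
# so the set reads as a single blended family rather than random hues
cmap = colormaps["Blues"]
n_outcomes = 5  # hypothetical number of Priority Outcomes

# Sample the middle of the colormap (0.35-0.90) so the lightest
# shade is still visible against a white background
palette = [to_hex(cmap(0.35 + 0.55 * i / (n_outcomes - 1)))
           for i in range(n_outcomes)]
print(palette)  # five light-to-dark hex codes
```

The resulting hex codes can be dropped straight into a design tool, so the same palette ties the charts and the illustrated sections together.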

The bottom graphic is hard to read. It might be that the color of each Priority Outcome is supposed to match the color of a pie slice, but it's not clear. Are the colors communicating information, or is this just the color palette? If the former, that was a good choice for unifying the piece, at least its bottom half. However, a bit of labeling on the bottom pie chart (like the top one) would have cleared this up.

And… What’s with the arrows? What are they pointing at?

We applaud the Finance Department for using data visualization to make the budget more accessible to citizens. And we love the clean look and feel of this piece. We’d love to see them use some of the best practices in the field of data visualization to make it even more impactful!

As a final note, thank you for NOT putting the City seal on this.