AEA 2014 Recap

Written by CRC. Posted in CRC Team

by Mandolin Singleton


Last month I attended the 2014 American Evaluation Association conference in Denver, CO. 2014’s conference theme was “Visionary Evaluation for a Sustainable, Equitable Future.” The event brought together research and evaluation professionals from all over the globe and from a variety of disciplines (e.g., community psychology, health and human services, PreK-12 educational evaluation). Attendees were encouraged to explore ways in which evaluation could be used to support sustainability and equality across disciplines and sectors.




This year’s conference was especially exciting (as well as nerve-wracking) for me because I was attending as a first-time conference presenter. I went to numerous sessions, learned a lot, and had a great time connecting with other evaluators. (I even found a little bit of time to explore Denver’s spectacular shopping scene.) Below are some of my highlights from the conference.


    Robert Kahle: Dominators, Cynics, and Wallflowers: Practical Strategies for Moderating Meaningful Focus Groups
Robert Kahle is a sociologist and expert in qualitative research, well versed in leading skill-building sessions for both new and experienced focus group moderators. In this workshop, he talked about how to effectively manage focus group dynamics, identified problem behaviors typically observed in groups, and reviewed strategies to recognize, prevent, and address them. I found this workshop especially informative and will be applying some of these techniques in my upcoming focus groups. If you’re interested in learning more about what was covered in this session, much of the content can be found in Robert’s book, “Dominators, Cynics, and Wallflowers: Practical Strategies for Moderating Meaningful Focus Groups”.





    Veena Pankaj: Data Placemats: A DataViz Technique to Improve Stakeholder Understanding of Evaluation Results
Veena Pankaj has experience directing evaluation design and implementation with a focus on participatory approaches (you can check out one of her recent publications on participatory analysis here). Pankaj described a data visualization technique (data placemats) that she uses to engage stakeholders, improve their understanding, and solicit their interpretation of evaluation results. She talked about the logistics of the technique (the what, when, and how) and reviewed the learning journey involved in their creation. If you’re interested in Veena’s work, you can find slides and resources from the session on SlideShare.


My Presentations


Deciding to one-up myself by giving not one but two presentations my first go-around didn’t really help my nerves, but what can I say – I was enthralled by this year’s conference theme (you could probably also say a little part of me was trying to impress the boss). I gave a poster presentation on an infographic that we (CRC) created for Elev8 Baltimore in an effort to visually display evaluation findings. The poster reviewed the process, results, and implications of translating data findings into a reader-friendly infographic.


Poster session reception


My poster showed that evaluation findings can be successfully translated into attractive, functional, and informative infographics. It suggested using infographics to report evaluation findings, as effective data visualization can attract readers, aid in the interpretation of data, and support comprehension. Perhaps most importantly, it made the case that infographics can promote the use of evaluation findings to inform decision making!


Me and my poster!


I also gave a paper presentation on our (CRC’s) establishment of an early warning system at the Elev8 Baltimore sites, and reviewed the application of early warning indicators to the middle grades. In the presentation I briefly described the system and reviewed the steps we took to create it – everything from gaining access to the data to producing the final reports. I also described how early warning indicators can enhance the accessibility and use of evaluation data: because reports go to sites on a quarterly basis, the indicators can inform programmatic decisions in near-real time. Finally, I discussed implications for expanding the use of early warning indicators from high school to middle school populations.
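As a rough illustration of how an early warning indicator works: CRC’s actual indicator definitions and thresholds aren’t given here, so the field names and cutoffs below are hypothetical, loosely modeled on the commonly used attendance/behavior/course-performance trio.

```python
# Hypothetical early warning flag sketch. Thresholds and field names
# are illustrative, not CRC's actual rules.

def early_warning_flags(student):
    """Return the list of indicator names this student has tripped."""
    flags = []
    if student["attendance_rate"] < 0.90:   # missed more than 10% of days
        flags.append("attendance")
    if student["suspensions"] >= 1:         # any suspension on record
        flags.append("behavior")
    if student["failed_courses"] >= 1:      # failed at least one course
        flags.append("course_performance")
    return flags

students = [
    {"id": 1, "attendance_rate": 0.95, "suspensions": 0, "failed_courses": 0},
    {"id": 2, "attendance_rate": 0.82, "suspensions": 1, "failed_courses": 2},
]
report = {s["id"]: early_warning_flags(s) for s in students}
```

Run quarterly against fresh data, a report like this is what lets sites act on the indicators rather than wait for end-of-year results.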


Between the multitude of great sessions I attended, learning loads of information, giving two presentations, and still finding time to scope out the shopping scene in Denver, you could say I had a whirlwind of a time at AEA 2014. For more pictures from AEA 2014, visit AEA’s Facebook page.

Secrets from the Data Cave: November 2014

Written by CRC. Posted in Dataviz

by Sarah McCruden

 Welcome to CRC’s monthly series of articles on all things techie: Secrets from the Data Cave! (For those who don’t know, the title references our room — fondly referred to as “the bat cave”— where data staff can geek out in an isolated setting.) Here we’ll be offering you a fascinating sneak peek into the cave, with the latest updates & tips on what we’re implementing here at CRC!

Visualizing Nonprofit Data: Tell the Real Story by Using Your Program Knowledge

  (This blog post originally appeared last month as a guest post for the Maryland Association of Nonprofit Organizations.)
Many nonprofit organizations rely on in-house staff members to crunch numbers and create reports for their program data. This means that, in some cases, those who are inexperienced at turning heaps of data into charts and graphs will find themselves stuck with a daunting task: creating meaningful data visualizations for their program.

There are a few basic tenets of data visualization to which everyone should adhere. For example, all pieces of a pie chart need to add up to 100%. But let’s say you know the basics already. What else might you want to keep in mind when choosing a visualization for your data?

My top suggestion: never forget that YOUR knowledge of your program and the populations you serve can make all the difference when it comes to presenting your results. So, if you think your data visualizations are not showcasing your program results in the way you expected, try to determine why they might look this way. There’s more than one way to visualize data.

Let’s look at a fictional program as an example: say I am reporting on the outcomes of a program serving adults struggling with addiction. I want to show that clients who participate in this program are not only more likely to start drug and alcohol treatment than those in a control group of equal size (who did not participate in the program), but also more likely to successfully complete the treatment program as prescribed.

When I look at the results, however, I am disheartened. It appears as though there is very little difference between the two groups in terms of who is more likely to complete treatment. Yet anecdotal evidence from my fictional program (talking to program participants and non-participants, etc.) tells me that those in the program do indeed seem more likely to overcome their addictions, stay in treatment, and stay clean. So what went wrong?

Well, here’s something I had not considered: what if most of the clients in both groups cited heroin as their drug of choice? In that case, their prescribed drug treatment would likely be an opioid maintenance program, i.e., methadone maintenance therapy (MMT), as this is commonly used in the treatment of opioid dependence. MMT is a treatment that can go on for years in some cases. So individuals on MMT would not technically be considered to have “Successfully Completed Treatment,” since they are not yet finished with treatment, even though many have abstained from drug use the entire time and have reformed their lives such that, had they not been on MMT, they would be counted as successful completions.

To remedy this, I use the same pie charts, but this time I break down the portion of clients that did not successfully complete treatment to show those who are still in MMT. This visualization shows that while the percentage that technically completed treatment is almost the same for the two groups, a large portion of program participants who did not complete treatment are still on opioid maintenance. In the non-participant group, the vast majority of those who did not complete treatment had “other” reasons for this (like dropping out of treatment early). This gives a completely different perspective on the same data, and shows just how powerful your choice of visualization can be. While both examples are accurate representations of the data, one is simply more effective at showing the program results.
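To make the idea concrete, here is a minimal Python sketch of how the very same counts yield two different pie charts: one that lumps all non-completers together, and one that breaks out clients still on opioid maintenance. The counts are invented (the program in this post is fictional), and the function name is mine.

```python
# Illustrative numbers only -- the program and its data are fictional.
# The point: one dataset, two honest pie charts, very different stories.

def pie_shares(completed, on_mmt, other):
    """Return (simple, detailed) slice proportions for a pie chart."""
    total = completed + on_mmt + other
    # Version 1: two slices -- hides the MMT story entirely
    simple = {
        "completed": completed / total,
        "did not complete": (on_mmt + other) / total,
    }
    # Version 2: three slices -- surfaces clients still in treatment
    detailed = {
        "completed": completed / total,
        "still on opioid maintenance": on_mmt / total,
        "other (e.g., dropped out)": other / total,
    }
    return simple, detailed

participants = pie_shares(completed=22, on_mmt=60, other=18)
controls     = pie_shares(completed=20, on_mmt=10, other=70)
```

In the simple version the two groups look nearly identical (22% vs. 20% completed); only the detailed breakdown reveals that most participant non-completers are still in treatment rather than having dropped out.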

So remember, YOU know YOUR program better than anyone else does. Use that knowledge to choose a visual representation that shows the world what your program has achieved!

Secrets from the Data Cave, October 2014

Written by CRC

by Sarah McCruden



Access vs. Excel: Which Will Reign Supreme (for your storage needs)

Access and its less showy cousin, Excel, are both good options for data storage. In this installment of Secrets from the Data Cave, I’ll highlight some things to consider when deciding between using Excel and Access for your data storage needs.

I should start by saying that Excel CAN do a lot of things that Access does. For example, I’ve built some incredible automatic scoring programs in Excel. However, in some cases Excel may prove (much) more time consuming because of all the formulas and manipulations you’d have to use to get the same results Access would produce very quickly.

Then again, if you do not know how to use Access already, it can be intimidating to learn a new program. If that’s you, then I would encourage you to check out this free, 12-part tutorial on YouTube to learn the basics of Access. You can find the first video here.


A brief rundown of key considerations

EXCEL works well for:

  • Computing aggregate totals from a single, flat data source (an example of this would be answers given to a single questionnaire or a table of demographic info)
  • Computing totals and/or organizing information where you have a single common identifier on everything (e.g., if you have a bunch of forms completed by clients, and every form has the client’s driver’s license number on it)

Please note that even in the above cases, you’ll need a pretty firm grasp of how to write Excel functions in order to get the results you may need.
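For readers who script rather than spreadsheet, here is what that flat, single-source aggregation looks like in Python; the questionnaire data and column names are invented for illustration, and the `Counter` call plays the role of an Excel COUNTIF.

```python
from collections import Counter

# A "single, flat data source": one row per questionnaire response.
# Field names are hypothetical.
rows = [
    {"client_id": "A12", "question": "Q1", "answer": "Yes"},
    {"client_id": "B34", "question": "Q1", "answer": "No"},
    {"client_id": "C56", "question": "Q1", "answer": "Yes"},
]

# Aggregate totals for one question -- the equivalent of a COUNTIF
# over the answer column in Excel.
totals = Counter(r["answer"] for r in rows if r["question"] == "Q1")
```

One table, one pass, no joins needed: exactly the situation where Excel (or a few lines of script) is enough.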

ACCESS works well for:

  • Integrating many different data sources relating to one central population (e.g., many different datasets that all give information about the clients your program serves)
  • Computing totals and organizing information when you have many different identifiers (e.g., if some of your forms have clients’ driver’s license numbers on them but other forms have only the last four digits of their SSNs, you need a relational database that can match your client list to any and all IDs that correspond to particular people)
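The “many identifiers” case is the relational idea in miniature. A hypothetical Python sketch (all IDs and names invented) of the crosswalk table an Access database would maintain to resolve every known ID back to one client:

```python
# A crosswalk maps every known identifier -- full driver's license
# number or last 4 of SSN -- to a single canonical client ID.
crosswalk = {
    "DL-991203": "client-001",
    "1234":      "client-001",   # last 4 of SSN; same person
    "DL-445566": "client-002",
}

forms = [
    {"form": "intake",    "id": "DL-991203"},
    {"form": "follow-up", "id": "1234"},
    {"form": "intake",    "id": "DL-445566"},
]

# Resolve each form to its client, mimicking a join on the crosswalk.
by_client = {}
for f in forms:
    client = crosswalk.get(f["id"], "unmatched")
    by_client.setdefault(client, []).append(f["form"])
```

Doing this in Excel means a thicket of VLOOKUPs; in a relational database it is a single join, which is exactly the point of the bullet above.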

And when in doubt, feel free to ask! Drop me a line with some details about your data and I’d be happy to suggest whether I think you’d be better off using Excel or Access for your project.


What I did on my summer vacation

Written by CRC

by Jill Scheibler

Hint: It involved both literal and metaphorical roller coasters.

Today— with suntans fading and schools back in full swing— it’s a few days into fall and I can definitely feel it! It’s gloomy outside and I’d like nothing more than to revisit my summer vacation. In my role at CRC I wear a number of different “evaluation hats”, and otherwise keep busy throughout the year teaching courses at a local university and directing a small arts nonprofit called Make Studio. When summer rolls around I am very eager to escape my not-quite-9-definitely-later-than-5 schedule for some fresh air and sunshine. Yet I don’t necessarily want to shut off my brain or escape the things that excite me about my work. So, this year I went to summer camp for nerds. By that I mean I was beyond thrilled to be accepted to and attend an event called MuseumCamp. Per the event’s leader, the innovative and fearless Nina Simon, author of Museum 2.0 and The Participatory Museum:
“MuseumCamp is an annual professional development event at the Santa Cruz Museum of Art & History in which teams of diverse, creative people work on quick and dirty projects on a big theme. This year, the theme was social impact assessment, or measuring the immeasurable. We worked closely with Fractured Atlas to produce MuseumCamp, which brought together 100 campers and 8 experienced counselors to do 20 research projects in ~48 hours around Santa Cruz.”


  In this second year of MuseumCamp, the event brought together evaluators (like me) and arts professionals (also like me), as well as artists, performing arts group staffers, museum professionals from museums large and small, and more! Attendees came from throughout the U.S., and from as far afield as Sweden and Wales.
Here we all are in our SC-appropriate, official camp shades.

At the end of camp Day 1, during which campers got acquainted with one another and participated in informative workshops from the likes of Ian David Moss of Fractured Atlas and Barbara Schaffer Bacon of Americans for the Arts, the 100 campers were broken into 20 teams. These teams were charged with (well, more accurately, selected via a demented white elephant process) designing a research project to measure the impact of one of dozens of arts and culture events happening during their stay in Santa Cruz. My team, which later became known as the “JerBears”, studied social processes at a Jerry Garcia tribute concert. (Conveniently for us, it was Jerry’s birthday; inconveniently for me, I’m not a fan of the Grateful Dead.)
Me (left) with two teammates. I think I’m hiding my lack of enthusiasm for jam bands quite well!

Other teams conducted studies involving a wide variety of local events and sites, including SC’s First Friday art walk, the famous “Steamer’s Lane” surf area, a mental health agency art exhibit, and a hard rock show at the boardwalk. Summaries and photos of all the projects can be viewed here.

Planning and executing our projects was challenging at times, but in between work sessions we were lucky to enjoy views such as these…

[Side note: If you’re a child of the ’80s like myself, you know that Santa Cruz is more than just a scenic, if a bit grungy, beach town; it’s also where this cinematic treasure was filmed.]

  So, aside from the excitement generated by enjoying copious amounts of sunshine, late 20th century nostalgia, and sea lion viewing, how did coming together in Santa Cruz allow campers to overcome their entrenched ideas about “research” and “measurement” in order to actually complete 20 research projects in 48 hours? Once again, per Nina S.:
“We encouraged teams to think like artists, not researchers. To be speculative. To be playful. To be creative. The goal was to explore new ways to measure “immeasurable” social outcomes like connectedness, pride, and civic action.”      

Tasked with the tricky job of convincing unwitting study participants to wear “JG party favors”.

Thinking as an evaluator, arts program director, and occasional academic, I was truly impressed by what all of the groups were able to accomplish under such steep time constraints, limited resources, and unusual circumstances.

Although frequently fun and even silly, camp was often grueling mentally and physically as we tried to pair lightning quick learning with professional networking and (literal) construction of research tools.

After being a JerBear, I particularly took away the following things from camp:

1. In determining indicators and proxies for complex phenomena, don’t be afraid to wander a bit. It was really helpful for our group to “get lost in the weeds” even though it felt frustrating at times. We started by thinking too big, then focusing down too fast, blew things up again after that, and then finally re-focused in to arrive at a realistic, but worthwhile, set of indicators.

2. Remember that there can be tremendous value in being unconventional and even weird in your tactics, particularly to connect with audiences and even research participants. (See Nina’s find re. “getting weird” here.)

3. Stay open-minded about problematic or limited results – they can still be revealing about some part of your research question, or informative for designing better efforts that get you closer to where you’d like to be.

4. Trust in your ability to work productively with an eclectic team— sharing goals and interests but absolutely no history can be a good thing and creates openings and (controlled) conflicts to stimulate new approaches. I will definitely seek out more opportunities like this in my daily work!


MuseumCamp 2014 officially closed with an all-hands debrief about all of our projects, facilitated by Alan Brown of WolfBrown, an evaluator experienced in measuring social impacts of the arts.

Alan Brown also really knew how to rock a sombrero.


But it was not really over until a rousing sing-along of “We Are the Champions”.

The JerBears… an unlikely but harmonious combination of an evaluator, two museum staffers, me, and a theater company folk. (Guess who was the biggest JG fan!)

  And that is why, if for no other reason, you’ll find me in Santa Cruz at summer camp for nerds again next year!  

I’m a bit late to this summer camp recap party! Please take a few minutes to read some of these posts by fellow campers, which are far more eloquent:

Baltimore Data and Evaluation Meetup

Written by Taj Carson



The Baltimore Data and Evaluation Meetup, recently created by CRC, is a group for people working at nonprofits, foundations, and government agencies who are interested in collecting and using data to improve their programs. Whether you are trying to figure out where to start, wrestling with providing data to funders, figuring out what outcomes you should be measuring, or analyzing and reporting on the data you already collect, this meetup is for you. All levels of expertise are welcome. Participants will discuss the issues they are facing and share ideas and resources in order to practically solve problems. At least one skilled evaluation professional will be present at every session. We are hoping to develop a group where people from different organizations can bring questions about evaluation, share ideas, and build a community around data collection, outcome measurement, and reporting.

Please join us for our first meeting on Wednesday, September 3, 2014 from 8:00 to 9:00 AM at Maryland Nonprofits, 1500 Union Avenue, Suite 2500, Baltimore, MD. We plan to have a discussion about potential topics for the rest of this year’s meetups. A light breakfast will be served.

Don’t forget to RSVP here.

For more information, please email us or connect with us through our social media channels on Twitter and Facebook.


CRC’s “Dumbphone” User

Written by CRC

by Tracy Dusablon


Each CRC staff person is assigned a month in which to write a blog – this month it was my turn.

At first, I was wracking my brain to come up with something instructive, like in my colleague Sarah’s series Secrets from the Data Cave, or hip like Sheila’s post about Data Driven Detroit.

Instead, I decided to write about something that sets me apart from my co-workers, and tell a little story about our office in the process.


The other day I was checking office voicemail online (we have an internet phone system) and came across this funny-looking icon. I noted how non-self-explanatory this icon was and, out of curiosity, asked a few co-workers if they knew what it meant. The conversation went a little something like this:

Me: “Hey, does anyone know what this ridiculous icon that looks like 110 camera film is?”


Co-worker #1: “Seriously?……..That’s the international voicemail symbol – it’s been around for decades.” [Note: co-worker #1 is in her early 20’s]

Me: “Decades……. really?”

Co-worker #2: “Haha, co-worker #1, you haven’t even been around for decades! But yes, that is the voicemail icon.”

 Co-worker #1: “Well, it’s the only voicemail symbol I’ve seen in my entire life. It’s on EVERY cell phone”

 Me: “Well, it isn’t on mine. When I have a voicemail on my cell, it looks like a phone handset.”

 Co-worker #2: “No way, you have it – you just don’t know.”

 Me: “Call me and leave a voicemail…I’ll prove it to you.”

This conversation continued. Co-worker #1 called and left a message on my phone. Sure enough – no 110 film icon appeared; just an old-school phone handset (much to everyone’s shock and amusement). Another co-worker chimed in this time, looking over her shoulder and brushing away tears of laughter from her eyes……“OMG, I had your phone in like 1996!”

Admittedly, I’m the technology dinosaur in our office. I’m in my late 30’s and the proud owner of a “dumbphone”. I also stay away from Facebook, Twitter, Instagram, and all the other social media that I know nothing about. I’ve faced my fair share of ridicule in the office because of this, but the voicemail conversation took the cake.

I’ve been asked, “In our work environment, how can you NOT have a smartphone?” My answer, most simply, is: “I don’t want one. Well, not yet.” It’s not that getting a smartphone hasn’t crossed my mind, but I’m ambivalent about it, and I never considered owning one until I started working here over a year ago. So at this point, I’m weighing the pros and cons. Here is my list so far:


Pros

  • Email access anywhere, anytime
  • Internet access anywhere, anytime
  • Capturing video and still photos of local Hampdenites sparring outside our office windows
  • Apps claiming to organize and simplify my life

Cons

  • Email access anywhere, anytime
  • Cost
  • Learning curve
  • Having a phone dictate my life
  • Auto correct (I have nightmares about sending inappropriate emails to clients. With my luck, I’d have the next contribution to the website Damn You Auto Correct)


The cons still outweigh the pros for me right now; I’m just not ready for a smartphone quite yet. Plus, have you read the article recently published in Science[1] about people who would rather shock themselves than be without their phones or other devices? I’m not itching to jump on that bandwagon!

Anyway, I like to think I make out just fine without a smartphone. I’ve never missed a meeting, I meet my deadlines, and have a means for getting in touch with people and for people to get in touch with me. Everyone has their own style. Mine just might be a bit more old-school than others. I mean, really, I DO text!



(1) Source article: Wilson, T. D., et al., Science, 4 July 2014: Vol. 345, no. 6192, pp. 75–77.




CRC Takes Detroit

Written by Taj Carson

By Sheila Matano

This past week, Taj and I visited Detroit to meet with Erica Raleigh at Data Driven Detroit (D3) and also took some time to explore the city.

Data Driven Detroit

D3 is a National Neighborhood Indicators Partner (NNIP) and an affiliate of the Michigan Nonprofit Association (MNA). Created in 2008, D3 houses a comprehensive data system that includes current and historic demographic, socioeconomic, educational, environmental, and other indicators. In addition to providing access to high-quality data, D3 also provides services such as data analysis and data visualization to organizations in Detroit.  For example, they created the One D Scorecard to show how Detroit compares to different regions across the United States. (See other examples of D3’s work.)



 D3 Building



 D3’s Mission



Shinola

Founded in 2011, Shinola is a Detroit-based company that produces watches, bicycles, leather goods, and journals. And yes, the company got its name from the 1940s colloquialism “You don’t know shit from Shinola”. Every Shinola product is made in the U.S., which is pretty great since no American watchmaker has produced watches at scale since the late 1960s! Currently, their factory has the capacity to produce 500,000 watches a year.






Sightseeing – Detroit Style

We had some time to drive around and check out some of Detroit’s tourist attractions and historic buildings. Matthew, our resident map guy, made us a handy interactive story map to use while on our trip.



Although we couldn’t get to all the sites because of limited time, thanks to Matthew’s map we were able to take an interesting self-guided tour around the city! (Full photo album is available on Facebook.)



The Michigan Central Depot, where some scenes from Transformers were filmed.



The Detroit Industry Murals by Diego Rivera at the Detroit Institute of Arts Museum


The world’s largest Masonic Temple



 Fox Theater


Detroit, like Baltimore, has a lot more going for it than people might think. This summer, CRC plans on exploring whether negative perceptions of Baltimore City (such as those perpetuated even by media we love, like The Wire) are actually supported by data. And we’ll be using our own Baltimore DataMind to do it. Stay tuned! We’ll be sharing some information soon.




Written by CRC. Posted in CRC Team, Dataviz

by Sheila Matano

Last week, I attended the EYEO festival for the first time. EYEO is unique in that it brings together experts from a wide variety of fields (e.g., computer science, engineering, data design, and cartography) to showcase their work. There were a number of great presentations; below are some of my favorites.


Sarah Williams: DigitalMatatus, Visualizing Informality

Sarah Williams is currently an Assistant Professor of Urban Planning and the Director of the Civic Data Design Lab at the Massachusetts Institute of Technology’s School of Architecture and Planning. The Civic Data Design Lab employs data visualization and mapping techniques to expose and communicate urban patterns and policy issues to broader audiences. In her presentation, Sarah talked about how her team worked with Kenyan universities and Nairobi’s growing technology sector to collect data on Nairobi’s transit system, which is mostly made up of matatus. As a Nairobi native, I found it pretty awesome to hear how Sarah and her team used this information to develop mobile routing applications and design a new transit map for Nairobi that changed how both residents and the government navigate the system.




Nicholas Felton: Too Big to Fail

Nicholas Felton is famous for his personal annual reports that incorporate different dataviz techniques to reflect his work. The image below is from the Feltron 2012 Annual Report.


In his presentation, Felton described how he attempted to capture a year of his communication exchanges in 2013 including conversations, phone calls, physical mail, email, texts and chat messages. He also talked about the methodology, privacy issues and design challenges of working with this dataset. You can keep up with his work on his blog.




Tahir Hemphill: The Rap Research Lab

Tahir Hemphill is a multimedia artist working in the areas of interdisciplinary collaboration, thought, and research. He manages the media arts education program for the Rap Research Lab, a place for teaching art, design, data analysis, and data visualization to students from the Bronx, using his project-based curriculum, which treats Hip Hop as a cultural indicator. During his presentation, Hemphill talked about his work on the semantic analysis of rap lyrics and how he used a robot to create visualizations by mapping the locations rappers mentioned in their music.










Micah Elizabeth Scott: Blinky lights for STEAM

Micah Elizabeth Scott talked about her experience working with both hardware and software. She has been doing unconventional things with technology for as long as she can remember and has built satellites, robots, virtual machines, graphics drivers, CPU emulators, networking stacks, USB controllers, reverse engineering tools, and pretty much everything in between. In her presentation, Micah talked about her work using LEDs to bridge the gap between technology and art, and the potential this new medium has as an open-ended educational tool. This year, she and her team at scanlime took a cloud to Burning Man on a forklift.




Jessica Hagy: Tiny Data

Jessica Hagy is an artist and writer best known for her Webby award-winning blog, Indexed. A fixture in the creative online space, her style of visual storytelling allows readers to draw their own conclusions and to actively participate in each narrative. She mixes data (both quantitative and qualitative) with humor, insight, and simple visuals to make even the most complex concepts immediately accessible and relevant. During her presentation, Jessica shared some of the humorous stories behind her visualizations.












Taj Carson: Everyone deserves beautiful data

Taj gave a great ignite presentation on why everyone deserves beautiful data. She gave us some insight on how data visualization can be made accessible to people or organizations who don’t have a lot of resources. We will post a link to her presentation when it’s available.




Northern Spark Festival

Northern Spark is an all-night arts festival held on the second Saturday in June. Tens of thousands of people gather along the Minneapolis riverfront and throughout the city to explore giant video projections, play in temporary installations in the streets, and enjoy experimental performances in green spaces and under bridges. I had a great time exploring the Minneapolis art scene; below are some of the installations.

















The Clock is a major cinematic work by New York–based artist Christian Marclay. The Clock samples thousands of excerpts from the history of film that indicate the passage of time—from clock towers to wristwatches to buzzing alarm clocks—that the artist has edited together to unfold on the screen in real time as a 24-hour montage. 


Ben’s Window, Artist: Ben Vautier




Below is a list of cool resources compiled from our time at EYEO:

Scratch is a programming language and an online community where children and adults can program and share interactive media such as stories, games, and animation with people from all over the world.

Duolingo is a free language-learning and crowdsourced text translation platform. The service is designed so that, as users progress through the lessons, they simultaneously help to translate websites and other documents.

Mapbox is an open source mapping platform for developers and designers.

Lynda is an online learning company that helps anyone learn software, design, and business skills to achieve their personal and professional goals. With a subscription, members receive unlimited access to a vast library of high quality, current, and engaging video tutorials.

For more EYEO pics, check out the CRC Facebook Page.

Secrets from the Data Cave, May 2014

Written by CRC on . Posted in

by Sarah McCruden

Welcome to CRC’s monthly series of articles on all things techie: Secrets from the Data Cave! (For those who don’t know, the title references our room — fondly referred to as “the bat cave”— where data staff can geek out in an isolated setting.) Here we’ll be offering you a fascinating sneak peek into the cave, with the latest updates & tips on what we’re implementing here at CRC!

May 2014:  The Difficult Database, Part 1: The Data Monster

In my experience working with relational databases, I’ve seen it all: the good, the bad, and the very ugly in data-keeping practices. I’ve learned a lot about wrangling data in an unruly database. While plenty of the problems I see are caused by a complicated combination of elements and require time-consuming fixes, sometimes the issues are simple and can be addressed relatively quickly and painlessly by program directors or database managers. Over the next few months, in a mini-series of posts on “The Difficult Database,” I will review a few of these common problems, what might be causing them, and how I would address them.

This month, I’ll talk about a very serious problem plaguing organizations everywhere . . . THE DATA MONSTER!

The Data Monster, as portrayed for Ashley Faherty circa 2009. Ms. Faherty kept the sketch as a reminder of the trouble the Monster can cause. Note: Not drawn to scale.


The Data Monster is a seldom-seen creature that sneaks into your database/case notes/paper records under cover of night and eats all the data. You’ll know you have a Data Monster if, when it comes time to pull a report out of your database, many of the records for your clients/patients/students are mysteriously missing. You know the work was done, and the efforts of your team should be reflected in your report—so where did the data go? The Data Monster ate it. That is to say, you have no idea where it went and why it’s not on the report. So you suffer from:

The Common Problem: No data/incomplete data in your database and on your reports, or inexplicably low participant counts for some measures.

Possible Cause(s) and Ways to Address Them:

  1. Records are not being entered by program staff at all

If you suspect that entire records are not being entered AT ALL (so, for example, someone filled out an intake questionnaire on paper for Suzie Q., who is enrolling in your Workforce Development program, but months down the line Suzie Q. doesn’t have an intake in the database at all), here are some things to consider:

  • Are you setting your database fields to “required” for too many answers on a very long form? While this may seem like a good way to prevent incomplete data, it can actually backfire if used excessively. It requires program and data entry staff to enter every single field in one sitting before they can submit the form (and I’ve seen forms with hundreds of fields, which could take hours to enter). When a form like this meets program staff who already have a full day of tasks to complete, it’s easy to see why data entry that eats up such a large chunk of time might get put off until later. Unfortunately, “later” can turn into much later, or never, because a continuous stream of new records keeps coming in that will also require entry. Even if staff have the option to save part-way through and return later to enter the rest, you’re likely to end up with a lot of saved, partially-completed forms that will never make it to submission, which is only slightly better than not having them entered at all.
  • I would suggest: Think about what should REALLY be required on that form, and adjust your field settings accordingly. First and last name? Most likely need that information completed. Mother’s maiden name? Probably not something you should require. Yes, in a perfect world, every field would be completed every time, and there are likely certain things that funders ask you to collect, but you need to think about the bare minimum of necessary information, and make those the required fields. That way, at the very least you will have some data to show on those participants, as opposed to none when their data is never entered, or their form is never completed and submitted.
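The “require only the bare minimum” idea can be sketched in a database schema. Here is a minimal, hypothetical SQLite example (the table and column names are invented for illustration, not taken from any real CRC database): only the essential fields carry NOT NULL constraints, so a partial record can still be saved and counted.

```python
import sqlite3

# Hypothetical intake table: only the truly essential fields are
# required (NOT NULL); everything else is optional and can be filled
# in later without blocking entry.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE intake (
        id                  INTEGER PRIMARY KEY,
        first_name          TEXT NOT NULL,  -- required: identifies the record
        last_name           TEXT NOT NULL,  -- required
        enrollment_date     TEXT NOT NULL,  -- required for reporting counts
        mothers_maiden_name TEXT,           -- optional: never blocks entry
        referral_source     TEXT            -- optional
    )
""")

# A partial record still saves, so the participant at least gets counted.
conn.execute(
    "INSERT INTO intake (first_name, last_name, enrollment_date) VALUES (?, ?, ?)",
    ("Suzie", "Q.", "2014-05-01"),
)
count = conn.execute("SELECT COUNT(*) FROM intake").fetchone()[0]
print(count)  # 1
```

With this setup, a data entry person racing through a stack of forms can submit the essentials now and backfill the optional fields later, instead of abandoning the record entirely.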

  2. Records for a participant are in the database, but the fields you need for data aggregation are blank/incorrect

Sometimes participants will have a record, but they won’t show up where they’re supposed to, or the fields you need aren’t filled in correctly (so Suzie Q., who is enrolling in your Workforce Development program, has an intake form, but things that should be filled in are blank or wrong). If that’s happening, here are some things to consider:

  • Are the data entry staff the same as the data collectors themselves? If not, is there an understanding between the two of how certain values should be entered? While some programs are set up such that the same person who would collect the data on paper would enter it into the database, many rely on designated data entry staff to do the electronic entry component. As someone who started out as a Data Entry Specialist, I have seen many cases where the person who filled out the paper form would leave a field blank, either because they were doing some type of task that involved them being quick in their observation/note taking (for example, observing motor skills in young children, who will not perform on command), or because it was not applicable and/or the correct value was assumed to be common sense or obvious. As a data entry professional, I entered things exactly as they appeared on the paper, especially since I did entry for a lot of standardized tests on which I wasn’t qualified to “assume” anything. Ideally one would ask for clarification if something is blank, but that is not always an option, especially if the person who completed the form is very busy. Thus, you might end up with blank fields in your database because, for example, the person completing the form left a field blank and assumed it would be filled in during entry, but it wasn’t.
  • I would suggest: The best way to deal with this is to have a meeting between those who complete the fields on paper and those who do the entry, and agree upon what is to be entered when a field is left blank. If there are exceptions to the rule, they need to be explicitly stated because ultimately, though you might want people to just use “common sense” in their entry, you also wouldn’t want them to assume something in error and cause the wrong value to be entered into your database. Communication is key to resolving this issue.
  • Does your form and/or database contain items with forced-choice values that are too close in meaning? If you have a drop-down menu in your database, or multiple choice item on a form, where two or more values are too close in meaning, you may end up with some program staff choosing one option, while others choose a different one, in cases where the value should be the same. For example, if your questionnaire asks what type of government assistance a participant receives, and you have one option for “Utility Assistance” and one for “Water Bill Assistance,” some program staff might assume that water is a utility and thus belongs under utility assistance. Then, when it comes time to get your total number of participants receiving assistance on their water bill, your count for “Water Bill Assistance” will be lower than it should be because some are counted in the “Utility Assistance” total instead. This is more often a problem when there are many options to choose from (I’ve seen databases with over 100 choices for a single item).
  • I would suggest: Either generate a list for program staff that spells out which values should be used for which responses, or remove or rename some of the options so that they cannot be confused anymore (for example, you could change “Utility Assistance” to “Gas/Electric Assistance,” so that program staff know it doesn’t refer to the water bill). Do keep in mind, though, that no matter what you do to fix this issue going forward, your old data may still have errors caused by the initial confusion, and you would need to be mindful of that when crafting your report.
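One way to be mindful of that old, ambiguous data at reporting time is to recode the look-alike categories into a single combined bucket before counting, since the historical records can’t be reliably untangled. A minimal sketch, with made-up category names and responses echoing the example above:

```python
from collections import Counter

# Hypothetical recode map: legacy responses that staff used
# interchangeably get collapsed into one combined reporting bucket.
RECODE = {
    "Utility Assistance": "Utility/Water Assistance (combined)",
    "Water Bill Assistance": "Utility/Water Assistance (combined)",
}

# Invented sample of historical responses.
responses = [
    "Utility Assistance",
    "Water Bill Assistance",
    "Food Assistance",
    "Utility Assistance",
]

# Unmapped values pass through unchanged.
counts = Counter(RECODE.get(r, r) for r in responses)
print(counts["Utility/Water Assistance (combined)"])  # 3
print(counts["Food Assistance"])                      # 1
```

Reporting the combined bucket avoids publishing a “Water Bill Assistance” count that you know is artificially low.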


Hopefully, armed with the above suggestions, you can tackle some of the issues you might be experiencing with your database, and that pesky DATA MONSTER will leave you alone once and for all!





Contracting 101: Accounting for People

Written by CRC on . Posted in Business operations, Contracting

by Kevin Majoros


It was pretty obvious from an early age that I would be working with numbers for a living as an adult. 

By the time I was seven, I had memorized the bowling averages, games bowled, and pin count totals of all 60 members of my mother’s bowling league.  Every week I would sit at a table in the bowling alley with the league stats in front of me while frightening women with bouffants walked by with questions like, “Hey kid, what do I need to bowl this week to raise my average to 170?”  My mother bowled in three leagues weekly and I always had the answers for any questions about the numbers.

My father was the bartender at the same bowling alley and I was allowed to run around and pick up pop bottles for money, although I was not allowed to touch the beer bottles. I used to walk around and calculate how many people would be drinking pop versus beer (bouffant divided by polyester squared) and how many bottles they might drink during the course of the night.  If I thought it was going to be a heavy night of beer drinking, I would pass on picking up the bottles and just concentrate on the bowling stats.

Even in college, as I worked my way towards a degree in Finance, I spent more time forecasting my test scores and their effect on my GPA than I did studying.


When I started working at Carson Research Consulting, I was happy to find that most of my responsibilities involved using numbers to answer questions.  Forecasting from trend analysis, time-tracking, building spreadsheets, and just plain number crunching are things I greatly enjoy doing.

When I realized that I would also be involved in contract management, which in effect meant that I would be dealing with people, the questions started immediately in my head:

Is there a template?

Can I create a formula?

Will there be a spreadsheet?

Are there people involved?  Please, not the people…..


The first thing to consider when dealing with contract management is to remember to maintain a logical thought process.  Even though people, unlike numbers, are not always logical, a planned course of action will generate the best results. 

Here are a few tips for managing the hurdles in the contract management process:

  1. Relationships are everything.  Make sure you have as many contacts as possible and maintain good relationships with them; this is vital to the successful completion of any project.  Once a contract gets into dispute or someone gets a chip on his or her shoulder, the project will most likely suffer as a result.

  2. Contract approvals take twice as long as planned.  In most cases, the consultant is expected to begin their work while the contract is going through the approval process.  Plan for this and have a bank line of credit available to fund the wages of your employees during this period.

  3. Look for guidance from your own staff.  The staff members involved in the contract will have the best feel for how things are progressing during the contract.  Interact with them regularly and get updates on how the work is progressing. 

  4. Track every dollar and labor hour related to the contract.  Use your budget to create a declining balance spreadsheet and track your labor, direct and indirect expenses on a monthly basis.  Sharing this with the staff will help keep the budget in check.

  5. Monitor variations in the scope of work, deliverables, and performance measures.  These are established in the contract but will generally vary in practice.  If new work is added during the life of the contract, there needs to be a reduction of work somewhere else to stay within budget.

  6. Know the payment terms of the contract.  Knowing when to bill, whether it is monthly or upon completion of deliverables, is key.  Get that invoice off your desk as soon as possible.

  7. Know the timeline of the contract.  When you are monitoring the time tracking stats, it is important to know where the peaks and valleys are in the timeline of the project.  When there are reports due, surveys being compiled or focus groups being managed, labor hours are going to increase. 
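The declining-balance spreadsheet from tip 4 is simple enough to sketch in a few lines. The budget and monthly cost figures below are invented for illustration: start from the contract budget and subtract each month’s labor, direct, and indirect costs to see what remains.

```python
# Hypothetical contract budget and monthly costs (labor, direct, indirect).
budget = 120_000.00
monthly_costs = [
    (8_000, 1_500, 950),
    (9_200, 700, 990),
    (7_500, 2_100, 960),
]

# Declining balance: each month's total spend reduces what's left.
balance = budget
rows = []
for month, (labor, direct, indirect) in enumerate(monthly_costs, start=1):
    balance -= labor + direct + indirect
    rows.append((month, balance))

print(rows[-1])  # (3, 88100.0) -- remaining budget after month 3
```

Sharing the month-by-month remainder with project staff, as tip 4 suggests, makes it obvious early when spending is outpacing the timeline.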

The above tips are just a few of the things that I have focused on because of their importance when mixing people with numbers (yes, the people….). 

It would also be a good idea to consider the words of interpersonal skills guru, Dale Carnegie:

“If you want to gather honey, don’t kick over the beehive.”