Why the Hell was Taj at a Design Conference and What Did She Learn There?

Written by CRC. Posted in CRC Team, Dataviz, Design

By: Taj
Last week I was at the Interaction Design conference. Now, you probably know that evaluation and design don’t exactly go hand-in-hand, so I understand if your next thought is a befuddled, “Huh?” You don’t usually find evaluators at a design conference.

So, was I just hopping a plane to San Francisco in February because of the awesome weather and the cool town? Well, not exactly, although I have to say those were both nice perks of being there. Did I travel to San Fran for some much needed R&R and to clear my head? Not entirely, but I do always feel inspired and innovative when I’m there. (And it’s a bit surreal to go by the Uber headquarters in your Uber.)

The truth is, I’ve been doing a ton of reading and thinking about design lately. The more I learn, the more I find that there is common ground between the way we do our work here at CRC and the work of design firms. I’ll be writing a series of blogs in the future about how we use design thinking in our approach to research and evaluation, but for now I’ll just quote Phi Hong Ha, who said in her conference talk, “Design is about helping people to make sense of the world.”

That’s what we evaluators do, too! Help make sense of the world. It’s our job to help people understand what is going on, what their clients are thinking and feeling, what the reality is of the communities they work in, and what the impact is of their programs.

Design thinking, combined with CRC’s love of beautiful data, makes a design conference the perfect place for me to learn new things, embrace old ways of doing things with a new language, and continue to be inspired. As Tim Brown (Tim Brown!) said, “Information is the material we are most using in design today.”

If that doesn’t sound relevant to what we do here at CRC, I don’t know what does.

So, what did I get out of this conference?

Creative innovation is risky. Smaller firms and individuals are often able to innovate, try new things, and “play” more than larger firms because we have a higher tolerance for risk and less bureaucracy to fight. One barrier to creativity, as Dan Saffer pointed out, is that “Efficiency is the enemy of creativity.” Creativity takes time. And it’s not always linear. Researchers love linear: linear relationships, linear regression. Here at CRC we have a certain tolerance for a bit of meandering if it means a better process, a better relationship, and a better outcome.

Things never turn out like you think they will. So stop expecting them to. It’s been our experience that projects never go exactly as intended. That’s because our work involves people, not widgets (or pharmaceuticals). And people change, make decisions, and adapt constantly. Jan Chipchase said, “Once you begin, assume that everything you planned is not relevant.” While this is an extreme perspective, I have found it very helpful to let go of the idea that nothing will change. Being flexible in our thinking has worked very well for us, and it’s a constant in the design world.

Silicon Valley is full of people telling you failure is great. “Fail fast,” they say. Foundations especially struggle with this. They often hold onto approaches or initiatives long after they should let go and move on to new and innovative ideas. Everyone struggles with change, individuals and organizations alike. But Saffer made a good point. “Failure sucks!” he said. Learning from failure is where it’s at.
We are constantly trying out new approaches and new tools. Some of them succeed, some of them fail. But I have yet to try out something new that we didn’t learn from, whether it was software, process, or research methods.

Another theme of design work is empathy. The idea of empathy often makes researchers uncomfortable. We embrace empathy here at CRC. Watching people work with their clients or fill out forms or struggle with databases gives us empathy for how we might develop solutions to make their work easier. Indi Young did a great job of defining empathy in a way that I think even evaluators and researchers can embrace. She differentiated between “emotional empathy” (feeling what someone is feeling) and “cognitive empathy,” which is about understanding how people are thinking and why and how they react.

I would argue that we need some of both, because we also need to understand the emotional state people are in when we are asking them to fill out forms, take attendance, complete surveys, etc. Sometimes they are upset, frustrated, or anxious (forms can bring it out in the best of us). But cognitive empathy applies too. How do people see and understand our data collection tools? How are they interpreting the questions we ask on surveys (hello, cognitive interviewing)? What makes them skip over some parts of the forms they fill out? Observing and talking with people about these things specifically can be very fruitful.

Danielle Malik wins the award for best presentation title with “Go Home Data, You’re Drunk.” She talked about how future trends will be all about analytics and customization (even more so than now). Because of the visual storytelling we do here, I was excited to hear that “data points are the words, and it’s up to us to construct the sentences.” Her presentation resonated with me, as she focused on the fact that data is not neutral. You have to constantly think critically about where it comes from, why you collect it, and how you intend to use it. And we’ll definitely be looking for ways to use the hashtag from her presentation … #drunkdata.

I think evaluators have much to learn from the design community. Design is user centered, process oriented, and collaborative. The design process requires that you empathize with your users, understand what their problems are, come up with creative solutions, test them, and then build in a process of iteration and tweaking until you get where you need to be.

I think evaluation should be more like design. I will admit that I often have doubts about the way we design and implement evaluations in this field. Evaluators often do not take the users into account. Some evaluators work in virtual isolation from their end users. How many actually spend time in the schools, public health clinics, and program sites where their data collection takes place (besides us, of course)? How many actually talk to the people filling out the surveys to find out what they think was meant? Karl Fass, professor of User Interface design, said, “We shouldn’t be using the vocabulary of natural science,” and I agree. I often question why we use the vocabulary AND the methods of the natural sciences in a context that is very human-centered and (let’s face it) often chaotic.

So I leave you with these questions…

What would happen if we used human-centered design principles and user testing to develop and evaluate social programs instead of evidence-based practices?[1]
What if we began every program implementation with empathy, planned it collaboratively with the end user in mind, and assumed that some iteration would take place before it got to where it needed to be?

What if we conducted our evaluations not like science experiments but like user analytics?

What if we were careful to collect only what we (or our clients) needed, to constantly review the data and collaboratively make decisions with it, and to not assume there was a beginning or an end point? How revolutionary would that be?

Well, if I’m going to be truthful, we know that good evaluation practice already does these things, even though we don’t always like to talk about it. But in my upcoming posts on design thinking, I plan to do just that.

[1] It’s a radical notion, but it’s one worth considering. Large, elaborate studies are conducted using the principles of the natural sciences. Then we plop those programs into a variety of locations as if context doesn’t matter. Designers know that context always matters. And that users can be almost infinitely segmented.

Secrets from the Data Cave: January 2015

Written by CRC. Posted in Technology and Customer Service

By: Sarah

 

 Welcome to CRC’s monthly series of articles on all things techie: Secrets from the Data Cave! (For those who don’t know, the title references our room — fondly referred to as “the bat cave”— where data staff can geek out in an isolated setting.) Here we’ll be offering you a sneak peek into the cave, with tips and the latest updates on what we’re implementing here at CRC.

(This month’s SDC is an on-the-road edition!)

 


 

Late last fall, I had the opportunity to go to phpworld 2014, a conference for PHP code developers, held in Washington DC.

 

Many different web applications are built with the programming language PHP, so coders have a myriad of options for working with it (including, but not limited to, Drupal, WordPress, Joomla, and Magento). This conference is designed to bring these communities together so that developers can learn how others are using the language and get ideas for enhancing their existing applications.

 

The conference sessions that fascinated me the most were those about web security. PHP is great for developers because it is a very powerful language, meaning you can do a lot with it. But, without the proper precautions, hackers can turn that same power against your applications and cause huge headaches.

 


 

One conference presenter gave a great example of how this might look in the real world. As context: if you haven’t protected your database, certain strings of text that a hacker types into your application will be interpreted as legitimate SQL code. Using this technique (called “SQL injection”), a hacker can feed your application input that masquerades as part of the original developer’s code, such as a “drop table” command that irreversibly erases all of your data.
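
To make this concrete, here is a minimal sketch of the difference between gluing user input into a query string and using a prepared statement. It is my own illustration rather than code from the conference, written in PHP with the built-in PDO extension and a throwaway SQLite database; the table and variable names are made up.

<?php
// Hypothetical example: a "speeders" table and a plate value supplied by a user.
$pdo = new PDO('sqlite::memory:');                    // throwaway in-memory database
$pdo->exec('CREATE TABLE speeders (plate TEXT)');

$plate = "ZU 0666'); DROP TABLE speeders; --";        // input carrying injected SQL

// UNSAFE: the input is concatenated straight into the SQL string, so the injected
// text reaches the database as if it were part of the developer's own code.
// $pdo->exec("INSERT INTO speeders (plate) VALUES ('" . $plate . "')");

// SAFER: a prepared statement sends the SQL and the data separately, so the
// input is only ever treated as a value, never as executable SQL.
$stmt = $pdo->prepare('INSERT INTO speeders (plate) VALUES (:plate)');
$stmt->execute([':plate' => $plate]);

echo $pdo->query('SELECT COUNT(*) FROM speeders')->fetchColumn() . " row(s) stored safely\n";

Prepared statements (or an equivalent escaping layer) are the standard defense here: the database never gets the chance to confuse data with instructions.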

 

 

With this in mind, consider the presenter’s example: A would-be hacker in Europe rigged up a fake license plate with a bit of SQL injection code on it. He was exploiting the fact that traffic cameras use picture-parsing technology to break down an image of a license plate into individual characters (to record the plate info of speeders).

 


 

So, this was presumably an effort to trick the traffic camera into parsing the code, feeding the data in as it would any license plate. This would have ultimately caused all the stored traffic data on the back-end to be wiped clean via the SQL injection! Now, it’s unlikely that this actually worked in practice. But, it did get me thinking about the lengths hackers will go to in order to mess with your data.

 

All in all, it was an excellent conference, and I feel I learned a lot about this powerful coding language. I’m now looking forward to connecting with more PHP developers in the future!

Data Visualization Predictions for 2015

Written by Sheila Matano

By: Taj and Sarah
Technology is always changing and evolving, and data visualization technology is no exception. While the term “infographic” was once unfamiliar to most people, fun visualizations are now regularly spotted on Facebook and Twitter feeds for both individuals and businesses.

But just how much can we expect to see change within a year? In the spirit of the New Year, we are posting a few predictions on where we think data viz is headed in 2015 and beyond:

 

Microsoft Office: it can do a lot more than you think

With all the integrated apps and changes in Office 2013, you can really accomplish some amazing visualizations using nothing more than your trusty old Microsoft Office software. But the fact is that a lot of people don’t use it to its full potential, and don’t realize what it can do.

We predict that in 2015, more people will figure out that Office can be used to create some really interesting visuals, like the chart of weather patterns below (an Excel template), or this dashboard we created:


 

No more giant reports sitting untouched in file cabinets

Or, at least, fewer of them: in 2015 we predict a small but perceptible decline in the popularity of the traditional final paper report for program evaluation (you know, that one you meant to read, but never actually got around to, and then you lost it in your desk drawer?) With the rising popularity of infographics, dashboards, and even graphic memos to communicate findings, people will begin to gravitate towards these options for reporting program outcomes, as they are sometimes a better fit.

 

Less is more

Data viz gurus like Edward Tufte have been touting this advice for a long time, but it is our hope and prediction that more people this year will come to see that less is more when it comes to visualizing data using charts and graphs. Fewer colors, borders, and dancing bears can, in fact, make your data viz better (see this concept come to life at http://i.imgur.com/WntrM6p.gif).

Have a prediction you’d like to share? Leave it in the comments below!

CRC’s 2014 In Review

Written by CRC. Posted in Business operations, CRC Team, Dataviz


Warm Winter Tidings from all of us at CRC!

   

 

We’ve had an extraordinarily busy year, and are so pleased to have shared much of it with you!  

 

This month, rather than a blog with evaluation-related info and tips, we’re once again ending our work year together by looking back and looking ahead.

 

Along with continuing to provide high quality evaluation and data services to clients — both existing and new for 2014 — Carson Research has completed its long-term project of updating our branding and communications so that they better reflect what we do and who we serve. Heading into 2015, we hope you’ll get used to hearing from us as simply “CRC”, and to calling us that to save breath and/or typing time.

 

 

To quickly catch you up on our 2014, our resident data viz department, held down by Ashley, beautifully summarized our year’s activities:

 

CRC Holiday Card Inside Front Cover    

 

But, as is our custom, we also asked all the CRCers to say a little (note: our definitions of “little” vary) something about their year. Below, our staff reflects on things they’ve experienced and learned from in 2014 and things they’re looking forward to in 2015:

 

  • Ashley, Database Analyst
This year I learned that infographics and dashboards are AWESOME! Utilizing fun icons, cool graphs, and eye-catching colors to tell data stories for our clients has been so much fun. I am looking forward to exploring more resources for creating them!
  • Dana, Research Assistant
I assisted with the implementation of a community survey and, having never done something like that before, I was able to learn a lot of new skills. Planning, implementing, and analyzing the results of the surveys within a 2-3 month span was hectic to say the least, but the experience helped me gain a lot of valuable tools and resources. I also took a qualitative analysis course at The Evaluators’ Institute in DC and an AEA online webinar on survey methods and sampling, which helped me learn new tips and tricks that I can incorporate into my current and future work. I also had the opportunity to learn some ArcGIS mapping skills and taught myself some rudimentary VBA coding skills in Excel as I worked on a proposal database for CRC. I’m looking forward to working on new projects and learning new things from them, as well as utilizing some of the skills I have learned over the course of this year, such as mapping.

 


CRC’s annual holiday lunch for staff. This year at The Food Market (http://www.thefoodmarketbaltimore.com).

 
  • Jill, Research Analyst
In 2014 I was greatly relieved and proud to officially complete my PhD program at UMBC! My favorite learning opportunity of the year was attending Museum Camp at Santa Cruz Museum of Art and History. It provided me with inspiring and invigorating ideas about how to make evaluation more user-friendly, engaging, and even fun for everyone involved! Next year I hope to apply more of what I learned at camp, attend other, similarly enlightening conferences, as well as to continue honing my visualization skills.
  • Kevin, Controller
I tore my ACL in July throwing discus and had to drop out of track & field at the 2014 Gay Games. I postponed surgery and went on to win 3 gold and a bronze in my main sport of swimming at the Games. I am now rehabbing after cadaveric ACL replacement & meniscus surgery. I still have many months of PT ahead of me but have registered for the 2015 EuroGames in Stockholm, Sweden to be held in August. It’s on!
  • Leslie, Research Analyst
This fall, I began a second master’s degree in Educational Psychology. It’s been challenging fitting studying into my schedule and the online format of the program is a new experience for me, but I enjoy being back in school. I’m looking forward to the classes I’ll take next semester in educational measurement and assessment development.
  • Mandi, Research Assistant
What I learned this year: (1) Giving presentations isn’t so scary after all – even at national conferences. (This year I attended AEA 2014 as a first-time presenter); (2) Focus groups can be both fun and informative. (I conducted a number of child focus groups this past year and found there were certain activities that kept kids more engaged during the session than others. Stay tuned for an upcoming CRC blog for details!). What I’m looking forward to next year: (1) More focus groups (I’m looking forward to conducting more focus groups next year, refining my interviewing skills, and finding different focus group activities that are both fun and engaging.); (2) Data visualization (I plan to incorporate more data visualization into the work I do, including qualitative data visualization tools and techniques.); (3) Attending conferences (I enjoy the workshops, networking with other evaluators, and traveling, especially to places I’ve never been to before).  

 

Opening our "Secret Krampus" gifts!

Opening our “Secret Krampus” gifts!

 
  • Matthew, GIS Analyst
This year I began learning how to code and to use that code to make more dynamic and customized maps. Next year I’m looking forward to expanding GIS skills and coding to grow the mapping services division here at CRC.
  • Sarah, Technology & Information Specialist
The coolest thing that I learned this year was how to program in PHP. It’s an incredibly powerful programming language! And next year, I am looking forward to launching our own online database application, using PHP and some other fun things!
  • Sheila, Research Analyst
The coolest thing I did this year was travel. At EERS, it was great to see old friends and hear Dr. Rog’s talk on infusing evaluation theory with practice. In New Orleans, hearing six surgeons general speak at APHA about their challenges, accomplishments and vision for the future was inspirational. In Detroit, I got to learn how Shinola makes their watches; it was also great to visit D3 and see how the organization works to provide communities with high-quality data. In Minneapolis, I learned about different data viz techniques at EYEO and visited art installations at Northern Spark. In New York, I attended the launch of Data & Society and met great folks who are researching the social, cultural, and ethical issues arising from data-centric tech development. At the CIC impact summit in DC, I heard how various tools can be used to tell more meaningful, accurate, and connected stories that can improve community outcomes. All were great experiences, but watching world renowned rugby players compete for the World Series Title at Sam Boyd Stadium was, by far, the most awesome thing I did this year. What’s next? In 2015 I’m looking forward to perfecting my chocolate mug cake recipe, learning more about data viz and improving my mapping skills. I’d also like to tackle learning a new language on duolingo.
  • Taj, CEO
The big game changer for me was attending the EYEO Festival in Minneapolis in June. EYEO brings together data geeks, coders, designers and artists for some of the most fascinating presentations and conversations I have ever experienced at a conference. It is THE data visualization gathering, in my opinion. I was so inspired by my experiences there, that I chose to take the big leap to go back to school. I’m now back in grad school, earning (another) Master’s Degree, this time in Information Visualization at MICA. Becoming a beginner again is very humbling, and I’m excited to see how what I learn in this program can help CRC go so much farther in our efforts to help our clients to visualize and understand their data.
  • Tracy, Research Associate
My biggest learning experience in 2014 was designing and implementing a community survey in two Baltimore City neighborhoods. In the coming year, I anticipate opportunities to work with new clients and expand my evaluation skill set.    

 


Krampus aftermath.

   

 

Happy New Year, and here’s to new learning, growth, and gainful opportunities… as well as more beautiful data… for you and yours in 2015!    

 

AEA 2014 Recap

Written by CRC. Posted in CRC Team

by Mandolin Singleton

 

Last month I attended the 2014 American Evaluation Association conference in Denver, CO. 2014’s conference theme was “Visionary Evaluation for a Sustainable, Equitable Future.” The event brought together research and evaluation professionals from all over the globe and from a variety of disciplines (e.g., community psychology, health and human services, PreK-12 educational evaluation). Attendees were encouraged to explore ways in which evaluation could be used to support sustainability and equality across disciplines and sectors.

 


 

This year’s conference was especially exciting (as well as nerve-wrecking) for me because I was attending as a first time conference presenter. I went to numerous sessions, learned a lot, and had a great time connecting with other evaluators. (I even found a little bit of time to explore Denver’s spectacular shopping scene). Below are some of my highlights from the conference.

 

    Robert Kahle: Dominators, Cynics, and Wallflowers: Practical Strategies for Moderating Meaningful Focus Groups
Robert Kahle is a sociologist and expert in qualitative research. He is versed in leading skill building sessions for both new and experienced focus group moderators. In this workshop, he talked about how to effectively manage focus group dynamics, identified problem behaviors typically observed in groups, and reviewed strategies to recognize, prevent, and address them. I found this workshop especially informative and will be applying some of these techniques in my upcoming focus groups. If you’re interested in learning more about what was reviewed in this session, much of the content can be found in Robert’s book, “Dominators, Cynics, and Wallflowers: Practical Strategies for Moderating Meaningful Focus Groups”.

 

 

 

 

    Veena Pankaj: Data Placemats: A DataViz Technique to Improve Stakeholder Understanding of Evaluation Results
Veena Pankaj has experience directing evaluation design and implementation with a focus on participatory approaches (you can check out one of her recent publications on participatory analysis here). Pankaj described a data visualization technique (Data Placemats) that she uses to engage stakeholders, improve their understanding, and solicit their interpretation of evaluation results. She talked about the logistics of the technique (e.g., the what, when, and how) and reviewed the learning journey involved in their creation. If you’re interested in Veena’s work, you can find slides and resources from the session on SlideShare.

 

My Presentations

 

Deciding to one-up myself by giving not only one, but two presentations my first go-around didn’t really help my nerves, but what can I say, I was enthralled by this year’s conference theme (you could probably also say there was a little part of me trying to impress the boss as well). I gave a poster presentation on an infographic that we (CRC) created for Elev8 Baltimore in an effort to visually display evaluation findings. The poster reviewed the process, results, and implications of translating data findings into a reader-friendly infographic.

 


Poster session reception

 

My poster implied that evaluation findings can be successfully translated into attractive, functional, and informative infographics. It suggested the use of infographics to report evaluation findings, as effective data visualization can attract readers, aid in the interpretation of data, and support comprehension. Perhaps most importantly, it implied that infographics can be used to promote the use of evaluation findings to inform decision making!

 


Me and my poster!

 

I also gave a paper presentation on our (CRC’s) establishment of an early warning system at the Elev8 Baltimore sites, and reviewed the application of early warning indicators to the middle grades. Within the presentation I gave a brief description of the system and reviewed the steps we took to create it – including everything from gaining access to the data to producing the final reports. I described how early warning indicators can enhance the accessibility and use of evaluation data; by providing reports to sites quarterly, the indicators can be used in near-real time to inform programmatic decisions. I also discussed implications for expanding the use of early warning indicators from high school to middle school populations.

 

Between the multitude of great sessions I attended, learning loads of information, giving two presentations, and still finding time to scope out the shopping scene in Denver, you could say I had a whirlwind of a time at AEA 2014. For more pictures from AEA 2014, visit AEA’s Facebook page.

Secrets from the Data Cave: November 2014

Written by CRC. Posted in Dataviz

by Sarah McCruden

 Welcome to CRC’s monthly series of articles on all things techie: Secrets from the Data Cave! (For those who don’t know, the title references our room — fondly referred to as “the bat cave”— where data staff can geek out in an isolated setting.) Here we’ll be offering you a fascinating sneak peek into the cave, with the latest updates & tips on what we’re implementing here at CRC!

Visualizing Nonprofit Data: Tell the Real Story by Using Your Program Knowledge

  (This blog post originally appeared last month as a guest post for the Maryland Association of Nonprofit Organizations.)
Many nonprofit organizations rely on in-house staff members to crunch numbers and create reports for their program data. This means that, in some cases, those who are inexperienced at turning heaps of data into charts and graphs will find themselves stuck with a daunting task: creating meaningful data visualizations for their program. Now, there are a few basic tenets of data visualization to which everyone should adhere. For example, all pieces of a pie chart need to add up to 100%. But let’s say you know the basics already. What else might you want to keep in mind when choosing a visualization for your data?

My top suggestion: never forget that YOUR knowledge of your program and the populations you serve can make all the difference when it comes to presenting your results. So, if you think your data visualizations are not showcasing your program results in the way you expected, try to determine why they might look this way.

There’s more than one way to visualize data.

Let’s look at a fictional program as an example: say I am reporting on outcomes for a program serving adults struggling with addiction. I want to show that clients who participate in this program are not only more likely to start drug & alcohol treatment, but also more likely to successfully complete the treatment program as prescribed, than those in a control group of equal size (who did not participate in the program). When I look at the results, however, I am disheartened. It appears as though there is very little difference between the two groups in terms of who is more likely to complete treatment.

Yet anecdotal evidence from my fictional program (talking to program participants and non-participants, etc.) tells me that those in the program do indeed seem more likely to overcome their addictions, stay in treatment, and stay clean. So what went wrong? Well, here’s something I had not considered: what if most of the individuals in both groups cited heroin as their drug of choice? In that case, their prescribed drug treatment would likely be an opioid maintenance program, i.e., methadone maintenance therapy (MMT), as this is commonly used in the treatment of opioid dependence. MMT is a treatment that can go on for years in some cases. So the individuals on MMT would not technically be considered to have “Successfully Completed Treatment,” since they are not yet finished with treatment, even though many have abstained from drug use the entire time and have turned their lives around to the point that, were they not still on MMT, they would count as “Successfully Completed.”

To remedy this, I use the same pie charts, but this time I break down the portion of clients that did not successfully complete treatment to show those who are still in MMT. This visualization shows that while the percentage that technically completed treatment is almost the same for the two groups, a large portion of program participants who did not complete treatment are still on opioid maintenance. In the non-participant group, the vast majority of those who did not complete treatment had “other” reasons for this (like dropping out of treatment early). This gives a completely different perspective on the same data, and shows just how powerful your choice of visualization can be. While both examples are accurate representations of the data, one is simply more effective at showing the program results.

So remember, YOU know YOUR program better than anyone else does. Use that knowledge to choose a visual representation that shows the world what your program has achieved!

Secrets from the Data Cave, October 2014

Written by CRC

by Sarah McCruden

Welcome to CRC’s monthly series of articles on all things techie: Secrets from the Data Cave! (For those who don’t know, the title references our room — fondly referred to as “the bat cave”— where data staff can geek out in an isolated setting.) Here we’ll be offering you a fascinating sneak peek into the cave, with the latest updates & tips on what we’re implementing here at CRC!


Access vs. Excel: Which Will Reign Supreme (for your storage needs)

Access and its less showy cousin, Excel, are both good options for data storage. In this installment of Secrets from the Data Cave, I’ll highlight some things to consider when deciding between using Excel and Access for your data storage needs.

I should start by saying that Excel CAN do a lot of things that Access does. For example, I’ve built some incredible automatic scoring programs in Excel. However, in some cases Excel may prove (much) more time consuming because of all the formulas and manipulations you would need to get the same results that Access can produce very quickly.

Then again, if you do not know how to use Access already, it can be intimidating to learn a new program. If that’s you, then I would encourage you to check out this free, 12-part tutorial on YouTube to learn the basics of Access. You can find the first video here.

   

A brief rundown of key considerations

EXCEL works well for:

  • Computing aggregate totals from a single, flat data source (an example of this would be answers given to a single questionnaire or a table of demographic info)
  • Computing totals and/or organizing information where you have a single common identifier on everything (e.g., if you have a bunch of forms completed by clients, but every form has the client’s driver’s license number on it)

Please note that even in the above cases, you’ll need a pretty firm grasp of how to write Excel functions in order to get the results you may need.

ACCESS works well for:

  • Integrating many different data sources relating to one central population (e.g., many different datasets that all give information about the clients your program serves)
  • Computing totals and organizing information when you have many different identifiers (e.g., some of your forms have clients’ driver’s license numbers on them, but other forms just have the last 4 digits of their SSN; you need a relational database that can match up your client list to any and all IDs that correspond to particular people, as in the sketch below)
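
For the curious, here is a minimal sketch of that “many identifiers” idea: a crosswalk table maps every known ID (license number, last 4 of SSN, and so on) back to a single client, and one join-based query does the matching. This is my own hypothetical illustration with made-up table and column names, written in PHP against a throwaway SQLite database rather than in Access itself, but the relational logic is the same kind of thing an Access query with table relationships does.

<?php
// Hypothetical tables: a client list, an ID crosswalk, and the forms themselves.
$db = new PDO('sqlite::memory:');
$db->exec('CREATE TABLE clients  (client_id INTEGER, name TEXT)');
$db->exec('CREATE TABLE id_xwalk (client_id INTEGER, id_value TEXT)');
$db->exec('CREATE TABLE forms    (id_value TEXT, score INTEGER)');

$db->exec("INSERT INTO clients  VALUES (1, 'Client A')");
$db->exec("INSERT INTO id_xwalk VALUES (1, 'D123-456-789'), (1, '6789')");   // license no. + last 4 of SSN
$db->exec("INSERT INTO forms    VALUES ('D123-456-789', 10), ('6789', 12)"); // two forms, two different IDs

// One query totals every form for each client, whichever identifier the form carried.
$sql = 'SELECT c.name, SUM(f.score) AS total
          FROM clients c
          JOIN id_xwalk x ON x.client_id = c.client_id
          JOIN forms    f ON f.id_value  = x.id_value
         GROUP BY c.name';
foreach ($db->query($sql) as $row) {
    echo $row['name'] . ': ' . $row['total'] . PHP_EOL;  // prints "Client A: 22"
}

Getting the same total in Excel would mean a chain of lookups across several sheets; in a relational database it is a single query.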

And when in doubt, feel free to ask! Drop me a line at sarah@carsonresearch.com with some details about your data and I’d be happy to suggest whether I think you’d be better off using Excel or Access for your project.


What I did on my summer vacation

Written by CRC

by Jill Scheibler  

Hint: It involved both literal and metaphorical roller coasters.

Today— with suntans fading and schools back in full swing— it’s a few days into fall and I can definitely feel it! It’s gloomy outside and I’d like nothing more than to revisit my summer vacation. In my role at CRC I wear a number of different “evaluation hats”, and otherwise keep busy throughout the year teaching courses at a local university and directing a small arts nonprofit called Make Studio. When summer rolls around I am very eager to escape my not-quite-9-definitely-later-than-5 schedule for some fresh air and sunshine. Yet I don’t necessarily want to shut off my brain or escape the things that excite me about my work. So, this year I went to summer camp for nerds. By that I mean I was beyond thrilled to be accepted to and attend an event called MuseumCamp. Per the event’s leader, the innovative and fearless Nina Simon, author of Museum 2.0 and The Participatory Museum:
 “MuseumCamp is an annual professional  development event at the Santa Cruz Museum of Art & History in which teams of diverse, creative people work on quick and dirty projects on a big theme. This year, the theme was social impact assessment, or measuring the immeasurable. We worked closely with Fractured Atlas to produce MuseumCamp, which brought together 100 campers and 8 experienced counselors to do 20 research projects in ~48 hours around Santa Cruz.”


  In this second year of MuseumCamp, the event brought together evaluators (like me) and arts professionals (also like me), as well as artists, performing arts group staffers, museum professionals from museums large and small, and more! Attendees came from throughout the U.S., and from as far afield as Sweden and Wales.

Here we all are in our MC official shades.

At the end of camp Day 1, in which campers got acquainted with one another and participated in informative workshops from the likes of Ian David Moss of Fractured Atlas and Barbara Schaffer Bacon from Americans for the Arts, the 100 campers were broken down into 20 teams. These teams were charged (well, more accurately selected, via a demented white elephant process) with designing a research project to measure the impact of one of dozens of arts and culture events happening during their stay in Santa Cruz. My team, who later became known as the “JerBears”, studied social processes at a Jerry Garcia tribute concert. (Conveniently for us, it was Jerry’s birthday at the time; inconveniently for me, I’m not a fan of the Grateful Dead.)

Me (left) with 2 teammates. I think I’m hiding my lack of enthusiasm for jam bands quite well! (Read about our project: http://camp.santacruzmah.org/jerbears-at-the-shakedown/)

Other teams conducted studies involving a wide variety of local events and sites, including SC’s First Friday art walk, the famous “Steamer’s Lane” surf area, a mental health agency art exhibit, and a hard rock show at the boardwalk. Summaries and photos of all the projects can be viewed here. Planning and executing our projects was challenging at times, but in between work sessions we were lucky to enjoy views such as these….

 [Side note: If you’re a child of the 80’s such as myself, you know that Santa Cruz is more than just a scenic if a bit grungy beach town, it’s also where this cinematic treasure was filmed.]

  So, aside from the excitement generated by enjoying copious amounts of sunshine, late 20th century nostalgia, and sea lion viewing, how did coming together in Santa Cruz allow campers to overcome their entrenched ideas about “research” and “measurement” in order to actually complete 20 research projects in 48 hours? Once again, per Nina S.:
“We encouraged teams to think like artists, not researchers. To be speculative. To be playful. To be creative. The goal was to explore new ways to measure “immeasurable” social outcomes like connectedness, pride, and civic action.”      

Tasked with the tricky job of convincing unwitting study participants to wear “JG party favors”.

Thinking as an evaluator, arts program director, and occasional academic, I was truly impressed by what all of the groups were able to accomplish given such steep time constraints, limited resources, and unusual circumstances.

Although frequently fun and even silly, camp was often grueling mentally and physically as we tried to pair lightning quick learning with professional networking and (literal) construction of research tools.

After being a JerBear, I particularly took away the following things from camp:

1. In determining indicators and proxies for complex phenomena, don’t be afraid to wander a bit. It was really helpful for our group to “get lost in the weeds” even though it felt frustrating at times. We started by thinking too big, then focusing down too fast, blew things up again after that, and then finally re-focused in to arrive at a realistic, but worthwhile, set of indicators.

2. Remember that there can be tremendous value in being unconventional and even weird in your tactics, particularly to connect with audiences and even research participants. (See Nina’s find re. “getting weird” here.)

3. Stay open-minded about problematic or limited results – they can still be revealing about some part of your research question or informative for designing better efforts that get you closer to where you’d like to be.

4. Trust in your ability to work productively with an eclectic team— sharing goals and interests but absolutely no history can be a good thing and creates openings and (controlled) conflicts to stimulate new approaches. I will definitely seek out more opportunities like this in my daily work!

   

MuseumCamp 2014 officially closed with an all-hands debrief about all of our projects, facilitated by Alan Brown of WolfBrown, an evaluator experienced in measuring social impacts of the arts.


Alan Brown really rocked a sombrero.

 

But it was not really over until a rousing sing-along of “We Are the Champions”.


JerBears… an unlikely combination of an evaluator, two museum staffers, me, and a theater company folk. (Guess who was the biggest JG fan!)

  And that is why, if for no other reason, you’ll find me in Santa Cruz at summer camp for nerds again next year!  

I’m a bit late to this summer camp recap party! Please take a few minutes to read some of these posts by fellow campers, which are far more eloquent:
     

Baltimore Data and Evaluation Meetup

Written by Taj Carson

 


The Baltimore Data and Evaluation Meetup, recently created by CRC, is a group for people working at nonprofits, foundations, and government agencies who are interested in collecting and using data to improve their programs. Whether you are trying to figure out where to start, wrestling with providing data to funders, figuring out what outcomes you should be measuring, or analyzing and reporting on the data you already collect, this meetup is for you. All levels of expertise are welcome. Participants will discuss the issues they are facing and share ideas and resources in order to practically solve problems. At least one skilled evaluation professional will be present at every session. We are hoping to develop a group where people from different organizations can bring questions about evaluation, share ideas, and build a community around data collection, outcome measurement, and reporting.

Please join us for our first meeting on Wednesday, September 3, 2014 from 8:00 to 9:00 AM at Maryland Nonprofits, 1500 Union Avenue, Suite 2500, Baltimore, MD. We plan to have a discussion about potential topics for the rest of this year’s meetups. A light breakfast will be served.

Don’t forget to RSVP here.

For more information, please email sheila@carsonresearch.com or connect with us through our social media channels on Twitter and Facebook.

 

CRC’s “Dumbphone” User

Written by CRC

by Tracy Dusablon


Each CRC staff person is assigned a month in which to write a blog – this month it was my turn.

At first, I was wracking my brain to come up with something instructive, like in my colleague Sarah’s series Secrets from the Data Cave, or hip like Sheila’s post about Data Driven Detroit.

Instead, I decided to write about something  that sets me apart from my co-workers, and tell a little story about our office in the process.

 

The other day I was checking office voicemail online (we have an internet phone system) and came across this funny-looking icon. I noted how non-self-explanatory this icon was and, out of curiosity, asked a few co-workers if they knew what it meant. The conversation went a little something like this:

Me: “Hey, does anyone know what this ridiculous icon that looks like 110 camera film is?”


Co-worker #1: “Seriously?……..That’s the international voicemail symbol – it’s been around for decades.” [Note: co-worker #1 is in her early 20’s]

Me: “Decades……. really?”

Co-worker #2: “Haha, co-worker #1, you haven’t even been around for decades! But yes, that is the voicemail icon.”

 Co-worker #1: “Well, it’s the only voicemail symbol I’ve seen in my entire life. It’s on EVERY cell phone”

 Me: “Well, it isn’t on mine. When I have a voicemail on my cell, it looks like a phone handset.”

 Co-worker #2: “No way, you have it – you just don’t know.”

 Me: “Call me and leave a voicemail…I’ll prove it to you.”

This conversation continued. Co-worker #1 called and left a message on my phone. Sure enough – no 110 film icon appeared; just an old-school phone handset (much to everyone’s shock and amusement). Another co-worker chimed in this time, looking over her shoulder and brushing away tears of laughter from her eyes……“OMG, I had your phone in like 1996!”

Admittedly, I’m the technology dinosaur in our office. I’m in my late 30’s and the proud owner of a “dumbphone”. I also stay away from Facebook, Twitter, Instagram, and all the other social media that I know nothing about. I’ve faced my fair share of ridicule in the office because of this, but the voicemail conversation took the cake.

I’ve been asked, “In our work environment, how can you NOT have a smartphone?” My answer, most simply is….“I don’t want one. Well, not yet.” It’s not that getting a smartphone has not crossed my mind, but I’m ambivalent about it and I never considered owning one until I started working here over a year ago. So at this point, I’m weighing the pros and cons. Here is my list so far:

Pros

  • Email access anywhere, anytime.
  • Internet access anywhere, anytime
  • Capturing video and still photos of local Hampdenites sparring outside our office windows
  • Apps claiming to organize and simplify my life

Cons

  • Email access anywhere, anytime
  • Cost
  • Learning curve
  • Having a phone dictate my life
  • Auto correct (I have nightmares about sending inappropriate emails to clients. With my luck, I’d have the next contribution to the website Damn You Auto Correct)


 

The cons still outweigh the pros for me right now; I’m just not ready for a smart phone quite yet. Plus, have you read the article recently published in Science[1] about people who would rather shock themselves than be without their phones or other devices? I’m not itching to jump on that bandwagon!

Anyway, I like to think I make out just fine without a smartphone. I’ve never missed a meeting, I meet my deadlines, and have a means for getting in touch with people and for people to get in touch with me. Everyone has their own style. Mine just might be a bit more old-school than others. I mean, really, I DO text!

 

______________ 

[1] Source article: Wilson, T. D., et al., Science, 4 July 2014: Vol. 345, No. 6192, pp. 75-77.

 

 

 
