Secrets from the Data Cave: March 2014

Written by CRC. Posted in Dataviz

by Sarah McCruden

Welcome to CRC’s monthly series of articles on all things techie: Secrets from the Data Cave! (For those who don’t know, the title references our room — fondly referred to as “the bat cave”— where data staff can geek out in an isolated setting.) Here we’ll be offering you a fascinating sneak peek into the cave, with the latest updates & tips on what we’re implementing here at CRC!

March 2014: I Am Going to S P A C E

1280px-Galaxy_history_revealed_by_the_Hubble_Space_Telescope_(GOODS-ERS2)

Earlier this month, while looking for some new and interesting data visualizations, I came across this nifty website that gives a spatial representation of the distance between the planets in our solar system.1 After thoroughly enjoying the learning experience (along with the witty interjections, as I patiently scrolled through the empty space signifying millions of miles), I visited the creator’s blog page, where he explained his motivation for undertaking this project:

“I kept trying to describe the distance using metaphors like ‘if the earth was the size of a golf ball, then Mars would be across the soccer field’ etc., but I realized I didn’t really know much about these distances, besides the fact that they were really large and hard to understand. Pictures in books, planetarium models, even telescopes are pretty misleading when it comes to judging just how big the universe can be. Are we doing ourselves a disservice by ignoring all the emptiness?”2 (emphasis mine)

It got me thinking: we put a premium on space use in data visualization when it comes to things like BI dashboards, infographics, or even paper reports. We typically want our data spread across as few pages as possible, so that we can process related information simply by shifting our gaze, as opposed to shuffling/scrolling through multiple pages. We might also want to see the same data represented as numbers in a table and as a chart or graph, so it makes sense to try and squeeze multiple representations of the same data onto one page.

But the question remains: as convenient as it is to see everything at once without scrolling across a screen, what do we lose in impact when we choose to condense large datasets into small visual representations? Even when we keep those representations relatively true to scale, how much is lost?

Take, as another example, epidemiological data on different countries’ AIDS-related deaths in 2001 and 2011 (UNAIDS3). This dataset has been condensed to include a few example countries:

aids_datatable

aids_datachart

The graph generally conveys the message that AIDS-related deaths in Kenya have dropped substantially in the last 10 years, while the number of such deaths remained about the same in the USA and United Kingdom, and that Kenya has many more deaths, overall, than the other two countries. But it’s still very difficult to wrap one’s mind around the magnitude of so many deaths just from the chart above.

For that reason, I created an alternative visualization of Kenya’s 130,000 estimated AIDS deaths in 2001 that, like the outer space example, involves a lot of scrolling. You can find it as a PDF here.
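
(For the curious, here is a minimal Python sketch of the general idea behind such a graphic: one dot per 100 deaths, stacked on a deliberately tall canvas that forces scrolling. The dot scale, layout, and file name are my own illustrative assumptions, not the method behind the actual PDF.)

    import matplotlib.pyplot as plt

    # Illustrative sketch only: one dot per 100 deaths, stacked into a very
    # tall, narrow canvas that the reader has to scroll through.
    DEATHS = 130000   # Kenya's estimated AIDS-related deaths in 2001 (UNAIDS)
    PER_DOT = 100     # each dot stands for 100 people
    COLS = 10         # dots per row

    dots = DEATHS // PER_DOT               # 1,300 dots in total
    xs = [i % COLS for i in range(dots)]
    ys = [i // COLS for i in range(dots)]

    fig, ax = plt.subplots(figsize=(2, 60))   # deliberately tall and narrow
    ax.scatter(xs, ys, s=4, color="black")
    ax.axis("off")
    fig.savefig("kenya_aids_2001.pdf", bbox_inches="tight")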

Now, I live in the real world. I get that we all need to consider limitations on space when creating data visualizations, especially when they will be printed. (Save some trees!) I also recognize that comparing sizes using the scrolling model is difficult, because we can only hold so much precise measurement in our memory once it is out of our line of sight. This is why I only included data for one country and one year, rather than comparing all three in one visualization. But my point remains that condensing as a visualization strategy does not always make comprehending data easier, especially when very large numbers are involved. There is something to be said for a visual representation of data that takes up a lot of space, when the data it represents is “larger than life.” And for that reason, going forward I plan to give my data visualizations as much space as I can, whenever possible and applicable.

Do any of our readers have any input on this topic? Leave a comment and let us know!

Sources:

  1. http://joshworth.com/dev/pixelspace/pixelspace_solarsystem.html
  2. http://www.joshworth.com/a-tediously-accurate-map-of-the-solar-system/
  3. http://www.unaids.org/en/media/unaids/contentassets/documents/epidemiology/2012/gr2012/20121120_UNAIDS_Global_Report_2012_with_annexes_en.pdf

 

 

It’s not another cat video but it’s just as cool – check out the new CRC Website

Written by Taj Carson. Posted in CRC Team, Dataviz

 

The completely revamped Carson Research Consulting website showcases the talents and experience of the CRC team, as well as the company’s expanded list of services. These offerings go beyond traditional research and evaluation services to include database wrangling and data visualization.

Services are managed and delivered by experienced data and technology nerds, researchers, and evaluators. The group’s strong work ethic and resourceful detective skills allow them to collect, organize, analyze, and report on the data their clients need to explore, explain, and improve their programs. And the new website showcases the backgrounds, skills, and experiences of each team member, including the new chief wellness officer.

“The website,” says Taj Carson, CEO of CRC, “reflects our approachable and highly applicable way of doing evaluations and visualizing data. I am excited,” she adds, “to show clients how we can present their data in a way that is more engaging and effective.”

For many of CRC’s clients, managing their databases was a daily struggle, and several expressed concerns about the program data they put into a database but had difficulty pulling out of it. This was especially true when it came to aggregating data in ways that would yield meaningful results. To address these issues, CRC began offering a full range of database wrangling services for their clients, from the first steps of data collection to the end result of a finished report.

The company’s database experience includes many popular software options, both online (such as Social Solutions’ ETO) and on desktop platforms (like Microsoft Access). This expertise allows CRC to teach their clients how to successfully navigate the data retrieval and aggregation process. Or, if a client prefers, CRC can manage their entire database and reporting process, leaving their program managers to focus their full attention on their participants.

Data visualization is another exciting new area for CRC, with clients working with the team to develop interactive dashboards and infographics as well as web-based maps. Recently, CRC created an infographic for Moveable Feast, a non-profit in Maryland that prepares and delivers nutritious meals to homebound residents at no cost. The infographic showcases Moveable Feast’s upcoming 2014 Ride for the Feast, one of their largest fundraising events.

“This interactive graphic is exactly what we needed,” says Mellisa Colimore, Moveable Feast’s event manager. “And it lets us easily, clearly and creatively deliver the information for this year’s ride while describing just how extensive Moveable Feast’s services are.”

Another recent data visualization project was the creation of an interactive map of Baltimore City community schools for the Family League of Baltimore City. This map allows users to visualize a range of information, such as whether a particular school is slated for construction or has an on-site health clinic. Users can also view the schools as an overlay with relevant community indicators, such as teen pregnancy and childhood lead exposure.

The clients served by CRC — nonprofits, government agencies, and foundations — are passionate about the human services work they do, and they work with CRC because they want someone else to deal with the mounds of resulting data. These organizations see the value of partnering with a team of data devotees who can help them understand the impact of their programs as well as how to improve them.

Secrets from the Data Cave: February 2014

Written by CRC

by Ashley Faherty

Welcome to CRC’s monthly series of articles on all things techie: Secrets from the Data Cave! (For those who don’t know, the title references our room — fondly referred to as “the bat cave”— where data staff can geek out in an isolated setting.) Here we’ll be offering you a fascinating sneak peek into the cave, with the latest updates & tips on what we’re implementing here at CRC!

February 2014: Predicting Student Achievement to Hire the Best Teacher, Faster

Teacher at Chalkboard

The hiring process in any field can be an arduous task. First, HR has to sift through many (sometimes hundreds, or even thousands) of resumes. Then, they take those that are best qualified and pass them along to the hiring manager. That person has to go through the resumes as well and contact the desired individuals. Afterwards, interviews, skill testing, and other steps ensue. And when hiring teachers, you have the not-insignificant added pressure of finding someone who can positively shape the minds of the future leaders of tomorrow. (Whoa.)

In the tricky case of teacher hiring, what if there were an initial screening completed with applicants even before human resources logs those many hours skimming resumes? Sure, there are teacher selection tools already in use, such as the Haberman Star Teacher Pre-Screener and Gallup’s Teacher Insight. But some argue that these tools don’t address an important outcome of teaching: levels of student achievement.

Hanover Research and TeacherMatch believe they’ve bridged this gap in the teacher selection process with Paragon K12 and the Educators Professional Inventory (EPI), respectively. Both evaluation tools are presented as web-based software that is easy to learn and use. So how do they work? A candidate submits an application and resume for a teaching job and is then directed to whichever program the district has chosen to use. The candidate completes the assessment online, and the software evaluates the responses against a large-scale meta-analysis of hundreds of thousands of variables, all of which have been found to correlate with student achievement. Paragon K12 identifies important teacher qualifications and characteristics such as experience, education, credential pathways, attitudes, attributes, self-efficacy, and cognitive ability. EPI also covers these areas, with the addition of teaching skills (that is, knowledge of teaching methods that bolster academic learning). All applicants are then scored or ranked according to their predicted impact on student achievement. Hiring managers can view the results immediately via a customized dashboard, see more in-depth information about each individual applicant if desired, and then move on to the next step of hiring: interviewing those who scored or ranked the highest.
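
Neither vendor publishes its model, but the scoring-and-ranking step is easy to picture in miniature. Here is a purely hypothetical Python sketch; the predictor names, weights, and scores are invented for illustration and are not TeacherMatch’s or Hanover’s actual variables.

    # Hypothetical illustration only: a weighted linear score over a few
    # predictors, with candidates ranked from highest to lowest. The real
    # Paragon K12 / EPI models are proprietary and far more elaborate.
    WEIGHTS = {"experience": 0.2, "self_efficacy": 0.3,
               "cognitive_ability": 0.3, "teaching_skills": 0.2}

    def score(candidate):
        """Weighted sum of a candidate's normalized (0-1) predictor values."""
        return sum(WEIGHTS[k] * candidate[k] for k in WEIGHTS)

    candidates = [
        {"name": "A", "experience": 0.9, "self_efficacy": 0.6,
         "cognitive_ability": 0.7, "teaching_skills": 0.8},
        {"name": "B", "experience": 0.4, "self_efficacy": 0.9,
         "cognitive_ability": 0.8, "teaching_skills": 0.9},
    ]
    for c in sorted(candidates, key=score, reverse=True):
        print(c["name"], round(score(c), 2))

The ranked list is essentially what the hiring manager’s dashboard would display, presumably computed from far more predictors than four.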

A cool feature of the TeacherMatch software is that it points out which knowledge and skills are most important to your school or district, based on the candidates you have chosen to hire in the past. This allows the software to do its job even better over time, as you can instruct it to put more weight on those areas and thereby increase the effectiveness of its predictions. It also provides a Professional Development Profile (PDP) to candidates who are offered positions. This profile assists them with steps they can take to become better teachers, such as suggestions for acquiring more knowledge in certain areas.

As with many topics in education, these programs could be controversial because there will undoubtedly be people who feel that they place too much emphasis on test scores. But, consider the table below.

 

Source: http://nctaf.org/wp-content/uploads/no-dream-denied_summary_report.pdf

In light of the fact that 46% of teachers leave the field within their first five years, any tool that helps ensure the hiring of quality teachers, and can even help them improve, should at least be considered as one part of the hiring process, alongside whatever other methods schools and districts choose.

 

Our Valentines: Why we love evaluation at CRC

Written by CRC. Posted in CRC Team

compiled by Sheila Matano

papercut_hearts

Earlier this year, the American Evaluation Association released a statement describing what evaluation is and its value to evaluators and the public at large. There are many different types of evaluation, and the time it takes to complete an evaluation project can vary from a few weeks to several years. Amidst a flurry of projects, debacles, deadlines, and what seems to be a long and arduous winter, the CRC team took some time this week to decompress and reflect on what we love about our work. As evaluators, we sometimes get bogged down by minutiae and forget the value of the work that we do! Taking a step back to reflect on our work and experiences encourages insight and learning, and fosters growth.

So in honor of Valentine’s week, we want to share why we love being evaluators and the work we do here at CRC.

(In the spirit of the season, sprinkled throughout are images from earlier this week, when CRCers Leslie, Sheila, and Jill celebrated the holiday in our neighborhood, through crafting, at “Love Notes”. This event was co-hosted by great local entities Trohv, Make Tribe, and Baltimore Paper Cuts, with refreshments by Kinderhook Snacks! Check them out!)

Leslie:  I love working with people who are passionate about the work they do and I enjoy being able to help them improve upon it. I love that every project is different and that I get to learn about new topics all the time.

Kevin:  I love working at CRC because I get to crunch and forecast numbers, and build spreadsheets all day. I also enjoy finding ways to defeat the obstacles in the life of a contract.

Tracy:  I love the fact that all of the projects I work on here at CRC are based locally. We work directly with our clients to effect change in the Baltimore community – a place where we all live and work. It is a rewarding feeling and a nice change of pace from my previous work on a national evaluation project, where I felt so far removed from my work.

trohv5

Ashley:  I love seeing how our finished products such as reports or infographics can take tons of numbers and information and transform it into something meaningful and easy to interpret. The easier we make it for a client to get the “gist” of their data, the easier it is for them to press forward with their programs and research and make bigger impacts on the community. I am very lucky to be a part of the CRC family!

Dani:  I love data sleuthing, and when the murky aspects of a program are translated into clean cut order via indicator grid, logic model, etc.

trohv9

Mandi:  What I love about evaluation? Well, many things! For starters, impressive data visualizations: I love how they enable you to blend (in a way) science with art. Next would have to be “knowing what works.” As evaluators, we become experts in a variety of fields (sometimes even overnight) and can tell you what will and won’t work when it comes to achieving a certain goal. I love the opportunities evaluation work provides to learn about new things. Also, attending evaluation conferences: it’s great to meet, share ideas with, and learn from other evaluators! Lastly, and probably one of the more mundane tasks: cleaning data – I find it relaxing. In a nutshell, I love evaluation for the great visuals, the multitude of learning opportunities, and the meditative properties that come with data processing.

Trohv2

Sheila:  The best part about being an evaluator is working with clients who are passionate about the work that they do and being able to help them enhance how their programs work. I also like that CRC is based in “Smalltimore” so I get to work with a lot of nonprofits and different groups of people. Every project offers the opportunity to connect with fellow Baltimore residents and to learn something new.

Dana:  What I love about evaluation is working as a collaborative partner with our clients to assess the efficacy of their program or intervention, identify successes and find creative solutions to overcome any barriers. As part of my work at CRC, I enjoy getting the opportunity to use my academic knowledge and creative skills to come up with different ways of presenting information in a useful and meaningful way. I also enjoy sharing and discussing ideas with my colleagues and getting to see and hear their perspective on different topics or issues.

Jill 1

Jill:  I love working in evaluation because I like to unpack why programs work…. or don’t. And I like that at CRC I have the opportunity to really explore different ways of uncovering meaningful information that great programs can use to show what they’re doing well and improve what needs improving.

Sarah:  I really love the fact that we work with so many local organizations at CRC. I’ve only lived in Baltimore since 2005, but it’s become home for me, and it’s nice to know I am working on projects designed to improve Baltimore City and the surrounding communities.

Matthew:  I like the fact that everyday that I come to work, I learn something new about my adopted hometown of Baltimore.

Ilse:  I love coming to work every morning because my humans at CRC are AWESOME! The treats are also a bonus.

trohv10

Taj:  I love working with such an amazing team of dedicated, innovative, and fun people. Everyone here knows how to work hard and laugh hard, and it’s a wonderful combination. I also love it when we produce something for a client that they find profoundly useful, whether it’s a straight report from a database or a beautiful map or data visualization. We recently provided a client with a report that would allow her to identify students in her school-based program who were at risk of dropping out and intervene early to help them. Her face lit up and she kept repeating, “This is so useful! This is so great!” When we can do that, it makes my day.

What do you love about the work that you do??

(And to wrap up our ode to love, here are a few more images of CRCers Leslie, Sheila, and Jill celebrating at “Love Notes”.)

lovenotes

Secrets from the Data Cave: January 2014

Written by CRC

by Sarah McCruden

Welcome to CRC’s monthly series of articles on all things techie: Secrets from the Data Cave! (For those who don’t know, the title references our room — fondly referred to as “the bat cave”— where data staff can geek out in an isolated setting.) Here we’ll be offering you a fascinating sneak peek into the cave, with the latest updates & tips on what we’re implementing here at CRC!

January 2014: Using Data to Convince Your Friends You Are Psychic on Super Bowl Sunday

Here we are in 2014, with February already nearly upon us, and as Super Bowl Sunday draws closer, I hear more and more people making predictions about who will win it all this year. Whether your team wins or loses (or didn’t make the playoffs), you can still come out on top by convincing your friends that you are psychic due to your uncanny ability to predict Super Bowl winners—and all you need to do it is a little data.

Now, full disclosure here: I know little about the game itself, and mostly show up to Super Bowl parties for the nachos, so I’m not accounting for the teams’ actual performance this year.

(PRNewsFoto/RiseSmart)

However, by using a single indicator from the Bureau of Labor Statistics (as reported in a study by RiseSmart1), odds are that I could still accurately predict the winner. It turns out that the team from the city with the lower unemployment rate has won 20 of the last 25 Super Bowls, making this indicator an 80% successful predictor of Super Bowl winners since 1989.2
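
The prediction rule itself is almost trivially simple; here is a hedged Python sketch of it (the unemployment rates below are placeholders, not actual Bureau of Labor Statistics figures):

    # Pick the team whose home city has the lower unemployment rate.
    def predict_winner(team_a, rate_a, team_b, rate_b):
        return team_a if rate_a < rate_b else team_b

    # Placeholder rates for illustration only:
    print(predict_winner("Seattle Seahawks", 5.2, "Denver Broncos", 6.1))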

A low unemployment rate reflects the economic wellbeing of the team’s city, and the CEO of RiseSmart has speculated that “One could argue that a fan base with lower unemployment is more likely to have expendable income to attend games, buy team merchandise, and cheer their team on at sports bars and restaurants. Maybe that additional fan support gives their team an edge that ultimately drives them to Super Bowl victory.”2

I’m not so sure about that theory, since the unemployment rate of an entire city is not necessarily representative of the unemployment rate among that city’s actual football fans (as opposed to those who only care about the nachos). But no matter the reason for the apparent relationship between low unemployment and Super Bowl victory, you can still use data to look like you are psychic, and that is pretty awesome.

 

Check out the data summary here: http://www.prnewswire.com/news-releases/seattles-12th-man-will-make-the-difference-in-super-bowl-xlviii-based-on-a-surprising-predictor-of-big-game-success-241307221.html and enjoy the game!

Sources:

  1. www.risesmart.com
  2. http://www.prnewswire.com/news-releases/seattles-12th-man-will-make-the-difference-in-super-bowl-xlviii-based-on-a-surprising-predictor-of-big-game-success-241307221.html

Happy New Year! And AEA 2013 Recap, Part 2

Written by CRC. Posted in CRC Team

compiled by Jill Scheibler

Happy New Year from all of us at CRC to our friends, evaluation colleagues, clients, and miscellaneous blog followers! 

If you’re like us, the transition period spanning the end of one year and the beginning of the next is a very busy time masquerading as a relaxing one. But even as we’d all like a bit more down time, this busy transition is a fertile period for reviewing lessons learned from the departing year and for setting goals for the new one…

…and, with that in mind, we’d like to share (with apologies for the delay) the second half of CRC’s recap of our experiences at the 2013 AEA conference! So, without further ado…

Tracy, Research Associate, provided the following reflections on her experience at a session on the topic of being mindful of when evaluation is appropriate:

There seems to be this push to evaluate everything these days, but in some cases a program may not be ready for evaluation. Maybe the program is not fully operational, or maybe there are no efficient ways to collect the necessary data. Regardless of the reason, it is important that we as evaluators consult with our clients about the best approach to obtain the most meaningful information. In some cases, this may entail conducting an “evaluability assessment” to examine whether the program is suited for evaluation. The timing was a bit uncanny: just a few days after hearing this discussion at an AEA plenary session, our office was presented with an RFP to assess the feasibility of conducting a randomized controlled trial (RCT) for a national learning initiative. I was left thinking, “Bravo!” to this organization for their willingness to invest time and money in determining whether it is appropriate to pursue this approach.

Sarah, Database Analyst, took an Eval 101 course to brush up on fundamentals:

The course was really informative and focused on some of the broader issues facing evaluators—a big change for me, since I usually focus on the minutiae of collected data. In particular, the instructor raised some questions of ethics in designing evaluations: for example, if an experimental learning program is tested in public schools through a longitudinal study, and it is determined that children getting the experimental services are far more successful than the control group, is it fair to keep children in the control group from accessing these services? I knew that similar issues come up in clinical studies (e.g., if a new experimental drug proves to be vastly more effective than the current standard of care, the option of taking that drug should be offered to the control group), but I had never thought about how this might apply in social services, where the measure of “success” is often less cut-and-dried than in experimental drug trials. It got me thinking: where do we draw the line? Virtually any experimental program aims to improve some aspect of human life, so who decides which improvements are significant enough?

Sheila, Research Associate, and Jill, Research Assistant, attended several informative sessions, but most enjoyed the annual business meeting of the AEA’s Data Viz topical interest group (TIG). They were excited to be part of the discussion about how the TIG will focus its efforts in 2014, and even more pleased to continue their roles as “support crew” for Taj’s Ignite presentation, informally known as WE LOVE MAPS. Taj had just 5 minutes to demonstrate why data nerds (like us at CRC) need to be using more maps… and how to know a good map from a bad one.

AEA Ignite 2013: WE LOVE MAPS

Check it out! And stay tuned for Taj’s next engagement at Ignite Baltimore this spring!

 

Secrets from the Data Cave: December 2013

Written by CRC

by Sarah McCruden

Welcome to CRC’s monthly series of articles on all things techie: Secrets from the Data Cave! (For those who don’t know, the title references our room — fondly referred to as “the bat cave”— where data staff can geek out in an isolated setting.) Here we’ll be offering you a fascinating sneak peek into the cave, with the latest updates & tips on what we’re implementing here at CRC!

December 2013: Working while traveling this holiday? Read this first!

It’s the holiday season, which means that lots of people are traveling to be with family and friends. Maybe you’ll have a long layover in an airport, a stay in a hotel, or an escape to a local café when it comes time for your crazy family to force everyone to sing “The 12 Days of Christmas” (oh wait, that may just be my crazy family).

Photo by slambo_42 on Flickr

But no matter what you have planned for the holidays, you may be tempted to use an unsecured Wi-Fi “hotspot” that you’ll find in hotels, rest stops, airports and coffee shops all across the nation. And if you deal in sensitive data (or want any privacy on your email or social networking sites), you should know that public wireless networks come with significant security risks. There are, however, a few things you can do to protect yourself while using a public Wi-Fi connection, which all boil down to one key thing: making sure your browser is secured at every juncture.

If a web address begins with “https” instead of just “http,” it means the connection is secured via Secure Sockets Layer/Transport Layer Security (SSL/TLS). SSL/TLS uses public key cryptography to encrypt the data communicated between client and server so that hackers can’t intercept the information (read about public key cryptography here, where I explain public/private key encryption). So you should always check the address bar at the top of your browser to make sure you’re in a secured browsing session.
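
(If you’d like to see that handshake from code, here is a small sketch using Python’s standard ssl module; www.example.com is just a stand-in host. The default context verifies the server’s certificate before any application data flows.)

    import socket
    import ssl

    # Open a TLS-wrapped connection; the default context checks the server's
    # certificate against the system trust store before any data is exchanged.
    hostname = "www.example.com"
    context = ssl.create_default_context()
    with socket.create_connection((hostname, 443)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=hostname) as tls_sock:
            print(tls_sock.version())   # negotiated protocol, e.g. 'TLSv1.2'
            print(tls_sock.cipher())    # negotiated cipher suite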

Keep in mind, though, that while some websites will redirect you to an https page if you only enter part of the web address, the redirect itself is a juncture at which a hacker can intercept your information. Facebook is one such site: if you type in “facebook.com,” you’re typing in an (abbreviated) http address, which is then redirected to https, and your login information could be compromised at that point.1 You can see how a hacker would “sniff” out such information in this great article from the Grey Hats Speaks blog.
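
You can observe that insecure first hop yourself; here is a quick sketch using the Python requests library (the exact status code and Location header depend on the site):

    import requests

    # Request the bare http address without following redirects, exposing the
    # unencrypted first hop that someone on the same network could intercept.
    response = requests.get("http://facebook.com", allow_redirects=False)
    print(response.status_code)               # typically 301 or 302
    print(response.headers.get("Location"))   # the https:// URL you are sent to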

So how do you protect against this? Aside from bookmarking the “https” version of every webpage and/or typing in the full web address for sites with “https” in front, you can get the HTTPS Everywhere add-on if you’re using the Firefox, Google Chrome, or Opera web browsers. This will at least help to activate security features on sites that are compatible with the encryption technology, but it doesn’t mean you’re safe on every site on the web. You can also find other tips for protecting yourself in this PC Mag slideshow on Ten Tips for Public Wi-Fi Hotspot Security.

But my advice? Stick to private Wi-Fi unless it’s an emergency. And by that I mean a data emergency. Being forced to sing “The 12 Days of Christmas” does not count as a data emergency.

Happy holidays, and see you in the New Year!

Source: http://www.greyhatspeaks.com/2013/10/mitm-against-https-sites.html

Out of School Time Programs in Baltimore City

Written by CRC

by Sheila Matano

The Baltimore Education Research Consortium (BERC) recently released a new report on schools in Baltimore City that provide out-of-school time (OST) activities. All the schools in the report were funded under the Community Schools Initiative at the Family League of Baltimore City.

One of the key outcomes for the OST programs was attendance; the report showed students who attended OST on a regular basis had a slightly higher school attendance rate than their peers who did not (95.0% compared to 93.0%).[1]

Another key outcome was that students who attended OST were significantly less likely to be chronically absent[2] from school in 2011-12 than their peers: 62% of regular OST attenders were no longer chronically absent, compared to 51% of their comparable peers. In concrete terms, the number of regular OST attenders who were chronically absent fell from 215 to 71, while the number who were severely chronically absent[3] fell from 33 to 24, as highlighted in the visual below.

OST_blogviz_fin

Challenges in Evaluation of OST Programs

Structured, well-implemented, high-quality after-school programs have the potential to support and promote a healthy learning environment. Programs that have shown promising educational outcomes address key factors such as access, sustained participation, program quality, and strong partnerships. However, there is still much to be learned about which factors work to support students and which outcomes should be tracked as measures of educational success.

What other interesting findings do you see in the report? What are your thoughts on what makes a Community School successful?

 


[1] Differences were not significant among high school students.

[2] Chronic absence is defined by MSDE as students enrolled for at least 90 days who miss more than 20 days. In the report, BERC defines chronic absence as missing more than the equivalent of one-ninth (or 20/180) of days on roll.

[3] Severe chronic absence is defined by MSDE as missing more than 40 days in a school year. It is operationalized in BERC’s report as missing more than the equivalent of two-ninths (or 40/180) of days on roll.

 

Secrets from the Data Cave: November 2013

Written by CRC

by Sarah McCruden

Welcome to CRC’s monthly series of articles on all things techie: Secrets from the Data Cave! (For those who don’t know, the title references our room — fondly referred to as “the bat cave”— where data staff can geek out in an isolated setting.) Here we’ll be offering you a fascinating sneak peek into the cave, with the latest updates & tips on what we’re implementing here at CRC!

November 2013: Why I’m Switching to Decaf

Cup_of_coffee_with_beans

I’m kind of obsessed with tracking my progress to goals. And not just goals at work, or long-term goals in my personal life—I mean little, day-to-day things. I’ve even been known to track the amount of time I spend cleaning my apartment each day, because without documentation to keep me accountable, I would simply never clean. Obsessive, I know, but effective.

So I was super excited when I found this nifty website, which makes progress to a goal (or anything really) so easy to track that even non-data nerds are using it: www.askmeevery.com.

You just enter your email address on the AskMeEvery site, type in a question for whatever it is you wish to track (like, “How many hours did I sleep last night?” or “How many miles did I run today?”), and you’ll get daily emails or text messages asking your question. All you have to do is reply to the email/text, and it will track the data for you!

I decided to record my productivity at the end of each workday, which meant that around 4 pm Monday through Friday, I’d get an email like this:

email

And after 2 weeks of this, I logged into the AskMeEvery site and it displayed my productivity ratings over time:

productivitychart

 

But the coolest part is that you can compare the results of multiple questions, and analyze them to see if there is a statistically significant correlation. So I tossed in another variable: how much coffee I had consumed each day.
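
(Under the hood, a comparison like this boils down to a simple correlation test. Here is a sketch of the same idea in Python with made-up numbers; I don’t know which test AskMeEvery actually runs.)

    from scipy.stats import pearsonr

    # Made-up daily logs: cups of coffee and self-rated productivity (1-10).
    coffee = [3, 4, 2, 5, 1, 4, 3, 5, 2, 1]
    productivity = [5, 4, 7, 3, 8, 4, 6, 2, 7, 9]

    r, p_value = pearsonr(coffee, productivity)
    print("r = %.2f, p = %.3f" % (r, p_value))
    # A negative r means more coffee went with lower productivity ratings.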

And here’s what I found*:

comparisonfinal

large effect

 

So it seems that the more coffee I drink, the LESS productive I become. And that is why I will be switching to decaf!

Check out askmeevery.com and let us know if you learn anything about yourself, and I’ll see you next month!

*Note: I realize that my methods are flawed here, but this is just for fun ;)

AEA 2013 Recap, Part 1

Written by CRC. Posted in CRC Team

compiled by Jill Scheibler

CRC was well represented at the 2013 American Evaluation Association conference, held close to home for us in Washington, DC! We learned a lot from this year’s sessions and had a great time connecting with old and new friends… But we didn’t just party there! (Although we did do just a bit of that with our fellow East Coast evaluators… see the evidence at the end of this post.)

AEA_2

Most of our staff attended AEA 2013, and each has something to share about what she learned there. Together we have so much to share, actually, that we’ll be splitting our conference post-mortem into two parts. Stay tuned for part 2, coming next week!

 

Dana Ansari, Research Assistant, attended a plenary presented by John Easton, entitled, “The Practice of Educational Evaluation Today: A Federal Perspective.” Her key take-aways from the presentation were:

1. Working partnerships between evaluators and stakeholders are important for making research both pertinent and functional.

2. Formative evaluations are useful in gathering feedback and identifying a program’s strengths and weaknesses, which can then be used to improve future implementation efforts. Randomized controlled trials are useful in making causal linkages; however, they may lack the ability to capture why and how a given intervention is effective.

3. Drawing from various research and evaluation approaches can help evaluators choose the most effective method for program improvement and success.

Leslie Gabay-Swanston, Research Analyst, distilled a couple of take-aways from different sessions that she attended:

Number one was that a distinction should be drawn between assessment and evaluation:

Assessment = What do we know?

Evaluation = How do we know?

Within a circular process, assessment and evaluation use the same elements, just in a different way.

Number two was a set of useful distinctions between evaluation types:

Collaborative Evaluation = Evaluators are in charge; there is ongoing engagement between evaluators and stakeholders.

Participatory Evaluation = Control is jointly shared; participants are very involved in the evaluation process.

Empowerment Evaluation = Participants are in control of the evaluation; the evaluator is a “critical friend”.

Related to empowerment evaluation, an ongoing challenge for evaluators is how to help participants become comfortable and confident enough to carry the evaluation forward.

Michael Quinn Patton (aka, Sheila’s “best friend, MQP”) presented this year, as he often does.

 

Mandi Singleton, Research Assistant, attended a workshop entitled “21st Century Strategies for Conducting Excellent Interviews.” It provided pointers for conducting long interviews and presented the concepts of “companioning” and motivational interviewing (MI). Mandi learned that:

  1. “Companioning” involves practicing effective listening skills that aim to increase the quality of responses. This process focuses on the virtues you, as the interviewer, bring to the table (e.g., being aware of your own biases, respecting the interviewee, maintaining focus, practicing open-mindedness, attending to non-verbal communication/body language, and showing interest and engagement). The interviewer should exude: 1) compassion, to actively engage the interviewee, and 2) detachment, understanding the interviewee without taking on their emotions.
  2. Motivational interviewing is powerful in combating resistance in participants (i.e., a lack of agreement on goals between interviewer and interviewee). In MI, interviewers should focus on the dimensions of:
  • Collaboration rather than confrontation (e.g., engage as partners; don’t confront participants about how they should change)
  • Evocation rather than education (e.g., draw responses out of the participant rather than putting words in their mouth)
  • Recognizing participants’ autonomy rather than expressing your authority, making them the agents of change and the experts on their own situations

Mandi also picked up a few pointers on how to increase participant engagement and reduce dropout when conducting long interviews:

  1. Consolidate
  2. Focus on relationships and building trust with the interviewee
  3. Be clear (on time and content); transparent
  4. Clarify own goals to get richer data (focus on quality vs. quantity)
  5. Avoid leading (steering the interview toward the answers you want)
  6. Leave space for open-ended questions
  7. Break up sessions to reduce interviewee fatigue
  8. Provide incentives
  9. Be sensitive to timing, make it convenient for participant
  10. Create buy-in, explain to participant how it will benefit them

 

We hope the first part of our AEA 2013 re-cap was informative for you! And now, for our “happy snaps”:

WP_20131017_018 2
Our good buddy and collaborator, Nichole Stewart!
WP_20131017_016
CRC’s own Leslie, Jill, & Taj
WP_20131017_013
A happy group of NYC evaluators!
WP_20131017_011
East Coast evaluators were willing victims of Sheila’s camera.
WP_20131017_012
Chris Lysy and Stephanie Evergreen- always keeping evaluation visually interesting!
WP_20131017_010
Stephen Axelrad and Taj
WP_20131017_014
More willing victims for Sheila’s camera and delicious wine.

 
