A Blueprint For Your Nonprofit Data

Episode 68 March 12, 2024 00:40:24
Heart, Soul & Data

Show Notes

Today we explore the Practitioner Data Initiative (PDI) and what we can learn from their effective approaches to improve our own nonprofit data. We discuss the importance of taking a holistic approach to data, including measuring impact, collecting and managing data, and leveraging technology. Erin shares her organization's journey with PDI and the improvements they made in data collection and cleanup. Alberta explains how PDI helps organizations identify gaps and develop skills to achieve the next level of data. We'll also dive into recommendations for organizations to start their own data journey.
 
Takeaways
  • Taking a holistic approach to data is essential for nonprofits to measure impact and make informed decisions.
  • Building comfort and confidence with data requires a combination of technical capacities and a data culture.
  • Prioritizing and making incremental improvements in data collection and processes can lead to significant progress.
  • Defining terms and asking the right questions are crucial for effective data measurement and analysis.

Bios

Erin MacKenny has a background in education and close to 10 years of experience working in poverty reduction in the non-profit sector. Throughout her career she has developed a passion for evaluation as a means to understand impact. Currently she is the Program Director and Measurement, Evaluation and Learning Lead at the Saint John Learning Exchange in Saint John, NB, Canada. In her role she leads the charge in developing stronger and more efficient data collection processes, creating and incorporating new tools, building the organization's data culture, and getting the team excited about the impact of their work. She loves to make talking about data fun and has been known to throw a great data party!

Alberta Johnson is the Manager of Data Solutions and Strategy at Blueprint. She leads a team of amazing humans that work on both the Practitioner Data Initiative and Blueprint's Data Solutions team. The Practitioner Data Initiative is a portfolio of projects funded by the Future Skills Centre that seeks to develop a set of tools to improve data capacity in the non-profit sector. This includes managing a team to work directly with non-profits and engaging in a set of strategic sector-wide research activities. Blueprint's Data Solutions team implements, manages, and integrates Salesforce to support Blueprint's various research activities. Using agile methodologies, the team works with every research team across the organization to support the distribution of surveys and incentives, the tracking of administrative data for programs, reporting at the project/program level, and data security and management practices to ensure the protection of participants' research data.

Chapters

00:00 Introduction and Background
01:21 Practitioner Data Initiative (PDI)
03:08 Importance of a Holistic Approach to Data
04:05 Saint John Learning Exchange's Journey to PDI
06:21 Developing Skills to Achieve Next Level Data
07:38 Building Comfort and Confidence with Data
08:07 Improving Data Collection and Cleanup
09:26 Identifying Gaps and Interventions
10:51 Making Informed Decisions with Data
12:38 Prioritizing and Incremental Improvements
13:30 Discovering Unexpected Gaps
14:53 Standardizing Processes for Consistency
16:19 Balancing Data and Human Touch
18:20 Recommendations for Organizations
20:31 Taking the First Step and Being Brave
23:42 Developing Effective Learning Questions
26:05 Importance of Curiosity and Interest
30:31 Defining Terms and Prioritizing Questions
36:08 Conclusion and Contact Information

 


Episode Transcript

[This transcript is auto-generated and may contain errors] All right. Well, thank you so much for joining me today. I am very lucky to be the host of two wonderful, wonderful guests. And we are going to talk about some extraordinary ways that we really can capture the incredible impact that our nonprofits have. So I am going to let my wonderful guests, Alberta and Erin, introduce themselves and where they come from. So Erin, why don't you start? Sure. I'm Erin MacKenny, and I live in Saint John, New Brunswick, Canada. I work for an organization called the Saint John Learning Exchange. We are a kind of small nonprofit. We've got about 26 staff total. We work with education, employment and empowerment programming. My name is Alberta Johnson. I work in an organization called Blueprint based out of Toronto, Ontario, but we work with organizations all across Canada. And I run a portfolio of projects here under the banner of the Practitioner Data Initiative. And so we're an evidence and kind of research organization that works with not-for-profits, government, and other types of organizations to kind of manage and deal with and kind of answer and problem solve around things related to their evidence. Yes, and the Practitioner Data Initiative, or PDI, is definitely one of the things that connected us. And this is an amazing program that I've immediately fallen in love with. So can you tell us a little bit more, Alberta, about exactly what PDI is? And then we can talk to Erin about how it works. Yeah, so for about two, two and a half years, we've been working on this program or this initiative. It's a pan-Canadian initiative funded through the Future Skills Centre, which is an organization in Canada that is funded by the federal government to help future-proof work, or do research around innovative work, figuring out what that looks like for AI, for automation, and those types of things.
So under this banner we work with 15 organizations across Canada, from different sectors, from different sizes. Some of them, like, just do employment programming. Some of them do multi-service delivery. And we award them a small 150 or 200K grant from the Future Skills Centre. And then we provide them about one to two years of expert support from the Blueprint team and also with our implementation partners, AJA. So this is not just like support to implement a data system or like to kind of talk about your IT needs or really just, it's not just focused on technology. We also like to have conversations with our partners about how are you measuring the things that are part of your impact? How are you thinking about measuring things in your strategic plan? How are you collecting and viewing that data and sharing that data internally? And really thinking about the holistic view of, like, collecting, synthesizing, storing and managing data. And how are you using that and leveraging that within your organization? Not just through the use of technology, but also through like kind of a data culture, or integrating data really well into how you're using it in your organization as a whole. So that's really what we're doing and we're really excited to be doing it because you get to do a lot of different types of things with a lot of different types of organizations through that lens. So yeah. Exactly. And I think that broad approach is... uncommon and yet so powerful because all those pieces are necessary but not sufficient on their own, right? You do need tech to be able to collect data, but just having the tech to collect data is not enough. As you said, thinking about how you define the things that you measure, how what you're measuring is going into your strategic planning and your operations. All of those elements are just so important. And so to see them brought together in a program is one of the things that I found so exciting.
So, Erin, how did you find, how did your organization find its way to PDI? What brought them there? So we heard about the call for proposals and we had been doing a lot of work. We did a social return on investment analysis, which wrapped up in 2019. So we had some background in collecting data and we felt like we were okay at measurement. But we called our project Next Level Data, because for us it's all about taking everything to the next level. And so we applied for the grant in the first round and we were very excited to be successful and to have been one of the projects chosen. Like I said, I work for a relatively small organization. We have a nonprofit side and we have social enterprises and in total, I think we have about 26 staff. So we're not gigantic by any means, but we do work with about 400 individuals in our community. And we're always interested in trying to figure out, number one, are programs having the impact that we want them to? We know that people are getting jobs and that they're keeping them because we've done a lot of work to measure that and make sure we're tracking it. But what is the overarching impact? Saint John is a city that has a lot of generational poverty. So we want to know, are we moving the needle when it comes to generational poverty? Are we helping to break those cycles? And we want to be able to back that up with data because we know that having good sound information is how you can do, number one, innovative work, but especially get money to do that innovative work, right? In the non-profit world, it's all about being able to get those grants. So that's sort of how we found the project. I think it was just an open call for proposals that came across our desks, and we put in an application and we were chosen. I love that you say you were ready to take that step to the next level of data. I think that speaks a lot because, you know, what Alberta is talking about, you're not going to do that on day one.
You're not going to have the sophistication, even the knowledge of what you're trying to do, of your theory of change, where you're going. And so this idea that you had already laid the groundwork for it and were ready to take that next step and continue to improve. So Alberta what are some of the skills that you would work with Erin and her team to try to develop in order to achieve that next level of data? Yeah, so we spent a lot of time in the kind of a discovery mode where we have like lots of really in depth conversations about what is your impact? What is the story you're trying to tell? What are the things that you're really curious about that your organization is doing that you really want to understand in a better way? And and really, and with specifically SJLE, we did a lot of that kind of conversation, then how do we actually implement technology to kind of support those questions? But across the whole portfolio, we tend, I think our overarching goal is really like, how do we make people or organizations more comfortable about making decisions about their data, about their infrastructure and about all of those things? We want people to feel like they are empowered and can use and manage and make those decisions in a way that's confident instead of just spinning their wheels. and really understand how everything is kind of connected. Like the storytelling about your impact is connected to the data that you collect, which is connected to how you're kind of interacting with your clients and how you're kind of storing and making that data accessible to your staff so they can use it on their day-to-day pieces. So there's a lot of like kind of really technical capacities, but overarching, we really want people to get that kind of like level of comfort and ability to make decisions, like good decisions about their data. Yeah. prior to joining PDI, Erin, what would you rank your organization's sort of comfort with using data in decisions? 
And then how has that changed over your engagement with them? Yeah, I think, well, because we had gone through the social return on investment analysis, which is a really, it was a five-year, long-term in-depth program, I feel like we thought we were really strong at making those data-driven decisions and that we were collecting really good data. But through that discovery phase and through the conversations with blueprint, we really started thinking about, are we collecting things because we actually need this information or are we doing it just because we had been? And so I think we improved significantly. We really cleaned up a lot of the things that we were collecting. We have a database that we use in-house called Outcome Tracker. And one of the things I love about it is that I'm able to go in and like. take questions out and add questions in. So I have a lot of control over the data points that we're gathering. And I definitely did a lot of slicing and dicing, a huge amount of cleanup, because part of taking the work to the next level is looking at what you're already doing and kind of deciding what you can let go of so that you can do something more. I love that. What can you let go of in order to bring something new in and take that next step? I don't think that that's often the first thing we think about. When we think about taking our data to the next level, I know I'm guilty of just wanting all the data, right? And so that idea that actually letting go of some of the things that you're collecting that may be taking you away from your biggest focus, that's a great point to think about in that cleanup phase. So, Alberta, how do you help organizations think about whether it's that cleanup phase? whether it's the other barriers that you see, what are some of the interventions you've seen support organizations in making those steps to get better data and get more comfortable with making those decisions? 
Yeah, so I think it really varies and it's really driven by the partner and the partner's wants and desires and what they really want from the program. So we work with, so we do very similar work in the discovery phase in terms of like, let's actually define and write down what you think you do, and then talk about how you're measuring or telling the story around that and collecting data around that. And moving from there, then we talk about, okay, so where do you think your gaps are? Or what do you think you need to enable you to actually collect that data and tell that story? And that could look like, Erin, I know with you, we talked a lot about qualitative data as well, because that's also a very important aspect. How are we collecting that? Are we putting it in a form? How are we making sure that that's kind of contained in one space where we can access it and people can kind of go through it and use it, rather than it living around in people's heads at the organization? Do you need to either improve the data systems that you have or the data infrastructure you have? Do you need to think about how you're collecting data and when is the right time to be collecting certain data from participants? A lot of folks that we talked to are saying, like, we want to make sure that we're investing in the right programs, we want to make sure that we're using our resources wisely, because not-for-profits don't have endless resources. So how are you kind of doing that math and figuring that out, using either technology or somebody sitting in a room and doing that math to answer those questions? And we try and figure out
what that looks like for them, what their ideal state is, or even what the kind of next level for them is. Like, well, yes, you wanna do machine learning, but first we have to start at this very, very simple level where you're able to consolidate your data in a way that could eventually be read by a machine to do learning. So really starting to think about making measurable incremental improvements with them, either through infrastructure, skills development, or even just kind of conversation, so we can kind of think and frame things around what's next for them to be able to get to where they want to go. Hope that made sense. Absolutely. No, and I think you're hitting on a point that often I feel like I frustrate people that I talk to about data and getting stronger and being able to make decisions with data, which is we want the silver bullet. We're like, tell us the one thing that if we just do it, then we'll be good with data. And yet to your point, there's not just one thing that we need to do that's going to help us be better. You mentioned so many different spaces where gaps might be. Or maybe your data are too focused, and you need to add qualitative data if all you have is quantitative data. Or your idea of saying, hey, how do we move this from just measuring outputs to really understanding which program to invest in? That's a big jump in the robustness and sophistication of the data that you're measuring. It could be a skills gap. Maybe the data are there, but people aren't comfortable working with it or analyzing it the right way. And to your point, it could just be that their data aren't even clean enough or consistent enough to be able to do any of the higher level activities, and that those steps all need to be put into place. So, you did a good job kind of outlining that whole journey that we all need to take. It's kind of like getting fit.
It'd be really nice if we could just wake up and take a pill and everything would be OK. But it's not. There's all these steps you have to take, and so many that have to happen first before you can take the later, more exciting steps. So were there any gaps that Alberta and her team at the PDI helped you find, Erin, that surprised you, where you weren't expecting that to be a gap or something that you needed to address before you could get to that next level, or in your process, I should say, of getting to the next level? I don't know if it was necessarily a gap, but one of the big things that we did through that discovery phase and one of the pieces that was sort of handed over to us was developing learning questions. So really thinking about what is it that the organization is curious about. One of the goals of our organization throughout the process of the PDI was to develop and implement a learner management system. And so we are kind of in phase two of that right now. So we were able to build the frame in the first year and start thinking about what is this going to do for us and how is it going to help us sort of feed into the goals of the organization. So we sort of, we were able to see that. We were able to see what we're not gathering right now so that we could think about what we need to do. And then I think the other thing that it kind of really shined a light on is processes. When you're a small organization especially, I think it's very easy to not really have processes because everyone kind of does everything. I'm the director of programming, but I'm happy to jump in and be a supply teacher or, you know, do intake for a program that I don't really work closely with. And so... We all wear a lot of different hats.
And so really thinking about how a process can be standardized so that we know we're getting the information that we need, but also that it's still human, because so much of what we as an organization value is that human interaction. And that's that light, warm touch that we have with our learners, because that's what helps them feel safe. So how do you gather a lot of information from someone and also make them feel really safe and comfortable in the space? And... What do you need to do at this moment versus at that moment? And it only works if there's actually a standard process that you're following. So that was probably the biggest, now that I'm talking, that would have been the biggest gap that we realized, one that I don't think I knew existed, because, you know, I thought I was great at that. Well, and it's interesting that you bring up this, the sort of standard operating procedures of how we do what we do in our nonprofits, as a data problem. Right? Many of us might not even think that would enter into the realm of a data discussion, because that's programs, that's, you know, whatever, that's operations, we don't have to worry about it from the data point of view. But to your point, right, if you don't have a standard way of doing something, you have no hope of then even defining what it is you're measuring, because there's no consistency in the thing that you hope to measure, let alone how it's being measured. So, I find that really interesting, and also that you brought up the point of the human touch. We do what we do because we care so much about the people that we serve. And it can feel sometimes like, oh, if we put a process in place, somehow that dehumanizes it. We have that conversation about data a lot, that somehow data is dehumanizing.
But as you said, no, this actually increases the level of service that you can offer and the support you can give, and honors those that you're serving, because you're doing it the best possible way that you guys have identified. Yeah, exactly. I think that one of the things that when we were rolling out the processes, because a lot of us have been there for a long time, so there's always like some hesitation or trepidation around changes, especially if it's to something that you've been doing a certain way. So it always comes back to: if we do things this way, this is how it will impact our learner, because the learner is why we're there. So if we follow this internal referral process, for example, and we make sure that these things are up to date, and these things are put into the database, then when the learner transitions to their new program, they're actually able to transition more smoothly and they feel like we actually know what we're doing and that we talk as a team. And so it's all about how you kind of present the process, I think, to the team so that they also see, oh, it makes sense to do this. It's actually more effective maybe. No, and that focus, like you said, just keeping those you serve at the center of everything that you do, is a great way of making sure that those choices do end up linking back to that, that you're aware of even the unintended consequences of keeping on doing things the same way, right? Versus making a change, because you could see that impact in having disorganized ways or inconsistent ways of doing something. So one question that I also had, I mean, there's so many great results that we're already hearing from Erin in terms of the changes that her organization's been able to make. But, Alberta, do you have recommendations for those of us who aren't lucky enough to be one of the 15 that get to work with you?
I mean, what would be some of your recommendations for things that organizations could start to do on their own to be able to achieve some of those great steps of progress that Erin's made? Yeah, I mean, I think there's like, there's so many. And the thing is, it's like the very unsexy stuff we don't talk about. It's not, like, yes, we're going to implement this beautiful CRM, and it's going to solve all the problems. It's like, no, you need to sit down and think about, what are the things you want to know about your organization? How are you coming to an agreement on those as an organization? And then also, like, how are you collecting and using data? Like, think about the actual, like, make a list of the data you're collecting, which is, again, not a very sexy thing to do. One of our consultants talks about it as like eating your vegetables. It's a thing you have to do, but it's not a thing that's always like fun and exciting and tasty. So really think about doing that kind of pre-work that seems very boring before you embark on doing any sort of significant technology changes or significant process changes. Like, really understanding what is currently going on in its most truthful way before you decide to move forward with making any sort of changes. And really thinking about what pieces of those processes or what pieces of those data are actually serving the purposes that you've identified that you need to be doing. I also really love, we had one organization that, like, did another talk with us as well. And her biggest recommendation is just like, be brave. Like, yes, it looks really scary and yes, it looks really overwhelming. But once you start to actually kind of do it, I think you get, you get kind of into it. You start to understand it more and you kind of build your own capacity.
So kind of like that bravery and that courage, I think, is also really important. And the organizations that we've worked with that really have somebody who's like really ready to dive in feet first, very much like Erin has, have really been successful and really been able to go so much further, because they have somebody kind of at the helm that's really curious and wanting to learn stuff and kind of move forward with that work. So that's also been very exciting and very humbling to see across the portfolio. So, yeah. Do you have anything that you think helped you be brave, Erin? Um, I think that the Learning Exchange loves to take risks. And so my approach to work is like, if it doesn't work, that's okay. But I would rather have tried something and had it not work out than not try it at all. And so I live that way in my personal life, and I live that way at work too. And so I think that, you know, getting it wrong is also good. Right? So if you approach everything as a learning opportunity, then every decision you've made is the right decision. So, you know, when we first started out, we didn't necessarily think that we were going to have the questions that we ended up coming up with. And that's okay. Things that we thought were really important to us, we realized through the conversations with Alberta and the team that maybe that wasn't as important as we thought. You know, we used to talk about... barriers a lot as being a really important part of our work. And we've sort of started to shift our mindset around that, because maybe we don't want to define people by their barriers, maybe we want to define them by their successes. But if we hadn't actually had conversations around that and started thinking about it, we may not have come to that conclusion. So I think just being okay with being wrong is probably the thing that helped me be the bravest. And then also having...
Alberta and the team to talk to and be like, I feel like maybe this is really wrong, is it? And for them to guide you helps too. Alberta, do you have a framework for helping people feel more comfortable and maybe making some of those mistakes? And I'll put mistakes in quotes. For those of you who can't see, definitely mistakes in quotes, the learning opportunities. Erin, I think that's a great framework for it. Yeah. I mean, like I've been, I've been doing kind of like this implementation, data infrastructure work for a while, and I have made lots of mistakes, and like nothing has, nothing terrible has happened from those mistakes. Like, yes, they cost me time, and probably money and a few other things, but you learn a lot from them. It's like, oh, you're never gonna do that again. Or like, that's a thing that you know you need to clock when you are doing that problem solving. And I think it's just showing up and being like, it's okay to make mistakes, and being very honest about kind of the world in which we live. Like, you're not gonna get very far if you try to create a perfect process. Like, perfect is the enemy of being done or good. So having those kinds of conversations and even just providing some of that emotional support for some of our partners is really exciting as well, because it's like, no, this is hard and you're doing a hard thing. And I think that's really, really wonderful and really, really good. And acknowledging that that's the kind of space that we're living in, I think, is the most important part in being honest about where everyone's at. And it is hard, it is challenging, but that doesn't mean you shouldn't do it. That's why it's so much better when we actually do succeed or you do learn a thing, and that's where kind of everything comes out of. So yeah, as Erin said, just do the thing, try the thing and then see what happens. That's the best part about all this work.
And I know that, Alberta, you mentioned this idea of helping people figure out how to ask the right questions, and that that's one of the first steps: what are you curious about? And are those effective questions? And I would be curious a little bit as well of like, how do you help that conversation go? I mean, how do you understand if you're asking the right questions, or, you know, you mentioned even understanding how you come to agreement about those questions. So do you have any recommendations about that process? Yeah, so we approach it in a very interesting way, because we just go to our partners and say, send us all of your documents. We want to read all of your documents and we will synthesize that for you and tell you what they say to us. And then you can tell us whether we're right or wrong. You can tell us where the nuance is, like what makes sense, what doesn't. And I think that's just like a really helpful thing. Like, when we give them this kind of story that we've created about the organization and the services they deliver, they're like, this is the first time I think I've seen this laid out in this manner. And it's kind of like, oh, this is very interesting. And it's kind of also a way for them to think big picture, instead of like, we work with a lot of people who are just sitting in certain pillars in an organization or certain service delivery ways. So this is kind of like, this is how everything connects across the organization. So thinking kind of big picture and then drilling down, I think, is really important, instead of trying to build up. And then really... we have a couple rules when we start to write learning questions, where we get them to just put any question that you have about anything at all on the board, and then we'll go from there.
And then we use that and then we're like, okay, now let's refine these questions according to this framework: the question has to be measurable, has to be connected to your impact, and it needs to be relevant, or like, actually measurable, so like either within a timeframe or... Yeah, so you have to kind of create these questions that you can actually then create survey questions out of. You can identify metrics, like, they're actually something that you can actually pull in and make measurable. So it's kind of this process of being like, you can have all of the grand ideas you want, but let's figure out how to translate that into a way that can be leveraged to collect data or measure against data. And I'll look to Erin to be like, I don't know if that was your experience. You were also one of the very first organizations that we did a lot of that work with. So I'm very curious to see if that was also kind of reflective of your process and if you have any thoughts. Yes, that's very much our experience. It's funny, I literally, I have our learning questions here and I have them with me all of the time, because they are really like a guiding beacon now for the Learning Exchange and what we do and how we engage in our work. And so when we're making decisions about changes that maybe we wanna implement or programs we wanna bring in or... features we want to add, we start thinking about like, how does it tie back to these things? And is it going to help us answer the questions? And I think one of the things that was the most helpful for us, because I think by nature, like, I work for a very curious organization, and I feel really lucky that I do. But one of the things on the document that was given to us is there's a rationale. So: if you can answer this question, what can you do with it? Which is really interesting to start thinking about. So there's lots of things that we're curious about.
We're always asking questions, but then to think about the impact of asking that question: if you can answer it, where does that take you? That helps you be a lot more strategic in the questions you ask, and in the ones you prioritize, because all of these questions are not being looked at equally. You can only do so much at a time, so you have to pick something that feels like it will build some momentum, and you start working toward that. The other piece is: why can't this be answered right now? So there's what you can do with the answer, and then why you aren't able to answer it in this exact moment. You can start identifying those gaps, thinking about first steps, and then asking what data you could be collecting, what you could be doing to feed into answering the question. All of that was laid out for us, which has been amazingly helpful. But I think you can do this even without guidance from an organization like Blueprint. What is it that you're curious about as an organization? If you could answer this question, what does it do? If the answer is, it just tells us some information, maybe that's not the most impactful question to go after. And then, what's stopping you from answering it right now? That's also a really interesting way to think about it, because it makes the work seem achievable: I can do this, it's just that this particular thing might be standing in my way. It's funny that I have it sitting next to me, but that was not planned; I was looking at it earlier. It's been so useful, probably one of the most useful tools and documents we got out of the whole process.

That goes back to what you were mentioning earlier, about deciding what data to let go of so you could move on.
This idea of prioritizing, I think, is such a critical one, because we don't have the resources to ask all the questions. That can be the space where we often feel very overwhelmed: oh my goodness, there's so much we have to do, how could we possibly accomplish all of this? How could I keep track of all of that data? There's no way. But your framework really guides you to say: okay, consider the whole broad scope of everything we're possibly interested in, then narrow it down to the things that would actually lead to an impactful action, and then identify which ones are actually achievable. If the reason we're not doing it is that it's unmeasurable, then okay, someday maybe, but we ought to set that one aside. Now we can focus on the ones where we say, well, we're not doing it because our CRM would need an extra field. Okay, that's solvable. I'm sure you've come up with a few other technological solutions to get people over that hurdle, and now suddenly you have access to that. So I feel like so many wonderful ideas and tips and pointers have been shared throughout this conversation. If you had to pick one or two next steps for an organization that doesn't have Alberta and Blueprint on call just yet, I would love to hear from each of you. What are the one or two things you would say, try this now, besides eat your vegetables?

That's a very hard question. I feel like it's sitting down and doing the "what are you curious about" exercise, but also really defining the terms that you're using. You say, we want to increase quality of life. Okay, what does that actually mean in your organization, in measurable terms? We want people to feel proud. Cool, what does pride mean to you? What does that measurement look like?
It can be a validated measure, or it can be something you make up yourself so you can tell your story: this is how we measure pride, and this is how it increases over time, whatever works for you. So really, as Erin said, think about what you're curious about, but also make sure that you're being incredibly specific and that everybody agrees on these terms that end up getting thrown around. We say, yeah, we all understand what quality of life means. Do you, though? What does that actually mean? Thinking that through uncovers a lot of potential misalignments, places where people don't actually agree and are talking about these key terms in very different ways. I think that's a very low-hanging fruit you can spend time on, and it really helps you align across the organization and figure out: if these things are truly this important to you, how are you going to move forward in measuring them? Or are you already collecting data that measures that? That's probably the lightest lift for starting this journey of figuring out what your measurement, evaluation, and evidence journey looks like.

I love that idea of continually asking, what does this actually mean? How do we actually define this? Because I agree: you may think it's really clear and that there's agreement, and you will likely very quickly find out that not only is it not as clear or as specific as you thought, but there's a lot of variation in how people thought about it, how they would measure it, and what they think it means. So I agree, I think that's a wonderful place to start, because everyone can do that.
I think you can get to the point where you say: okay, we have very clearly defined the things that matter to us, and how we would know we were doing them. As you said, whether it's quality of life, whether it's pride in your employment, whatever it might be, you can get it to the point where, if you came across it on the road, you would know it was that. How do you actually know? There's a sort of onion game where you keep peeling back the layers until you get to the core, and it can be unique to you; it doesn't have to be what everyone else says. Or if you're a little stuck on how to define it, I love that you mentioned that other people have asked these questions too. Maybe someone else has already come up with a tool for quantifying pride in your career, or a tool for measuring quality of life, and you can consider which domains of quality of life matter to you specifically, out of a tool that already exists. That idea that you don't have to reinvent the wheel, that you can see what other resources exist out there, is a great reminder. So Erin, how about you?

Oh, Alberta's answer was so good, now it's hard to follow. I think that if you're going to start on this journey, having a real interest in it is important. Once we had our questions defined and everything, we really had to sit back and think: in my gut, what am I most excited to figure out? Because that builds buy-in, it builds momentum, and it builds excitement. Yes, it's important to map out all of your data, and you can do that too; that's a very easy thing to do. What are we already collecting, and how? It's not a super thrilling exercise, but it's a very useful one.
But once you've done that, you have to do a bit of a gut check and figure out what's driving you to want to do this work, because if you don't feel excited or interested in it, it's probably not going to continue. So that's where I would go. My other thought would be to put a little group together of people who have an interest in it, maybe just a small group to start. That's what we did at the Learning Exchange. We call ourselves the DAWG, the Data Analysis Working Group; we added the "analysis" in there so it would have a cool acronym. It's kind of funny, but really the whole point is that you have committed time together with a small team to start talking about some of these bigger things before you bring them to the larger group, before you implement any changes. It gives you that creative space. And those sessions are working sessions, so no one is doing anything extra outside of the session, with the exception of me, because I'm leading them.

You might think: it's just me and my little program, how could I possibly make any difference in any of the stuff we're talking about? But actually, to your point, that's where you should start. Start with the people who have that close, lived experience with the data, or the lack thereof, within your organization, who have that passion and that interest. That is actually the most powerful place to start. From there, you can ask some of the great questions that Alberta laid out, take those answers up and across the organization, and you'll be much better suited to have it be a successful learning endeavor. Excellent. Well, thank you both so much for your time today. This has just been such an extraordinary conversation. You're both doing such great work.
If folks wanted to follow up with you and learn more about your two organizations, where could people go to connect and find out more?

That's a great question. You can find more information at blueprint.ca; it's the one based in Toronto. There are a couple of different Blueprints, so I'd like to qualify that. Also, if you follow the mailing list or anything related to the Future Skills Centre, a lot of the Practitioner Data Initiative information will be coming out of that body rather than directly from Blueprint. And you can find me on LinkedIn, Alberta Johnson; I'm happy to hang out there.

For the Learning Exchange, you can check us out online at sjle.org; there's lots of information about our organization there. If you're looking to connect with me, I'm also on LinkedIn, Erin MacKenny, M-A-C, just throwing that out there because my name has all of the extra letters. We have a social media presence, and we're actually going to start showcasing more of our data on social media; that's one of the next steps in our work. So you can follow us on Facebook or Instagram, check us out, and you'll see some of the work we're doing too.

Excellent. Well, thank you both so much for your time today. Thank you.