Beth Kanter and Sam Caplan
Beth Kanter, nonprofit trainer and author, and Sam Caplan explore the nuances of how nonprofits can start using AI today in a responsible way.
This episode features Beth Kanter, nonprofit trainer, facilitator, and author, on the rise of AI. She’s been training nonprofits on how to adapt to new technology for four decades, and she’s learned a lot in that time.
In this episode, we cover:
Similarities and differences between AI and previous tech advancements.
How AI could address, or worsen, nonprofit burnout.
One question you should ask yourself to adopt AI responsibly.
Beth Kanter is the co-author of four books for nonprofit leaders; her most recent, “The Smart Nonprofit: Staying Human-Centered in an Automated World,” is about AI. She received the NTEN Lifetime Achievement Award and has many decades of experience working with nonprofit leaders. She has delivered training to nonprofits and keynote talks globally on topics such as workplace wellbeing, disruptive technologies & mission-driven work, and the use of AI for productivity. Learn more about Beth at www.bethkanter.org.
Sam Caplan is the Vice President of Social Impact at Submittable. Inspired by the amazing work of grantmakers of all stripes, Sam strives to help them achieve their missions through better, more effective software. Sam has served as founder of New Spark Strategy, chief information officer at the Walton Family Foundation, and director of technology at the Walmart Foundation. He consults, advises, and writes prolifically on social impact technology, strategy, and innovation. Sam recently published a series of whitepapers with the Technology Association of Grantmakers titled “The Strategic Role of Technology in Philanthropy.”
Episode Notes:
Follow Beth Kanter on LinkedIn.
Read Beth’s Blog, in particular her reflection on Microsoft's recent Global Nonprofit Leaders Summit.
Read Beth’s recap of a conversation with Devi Thomas of Microsoft Philanthropies, a friend of the podcast.
Sign up for The Review from Sam Caplan.
Episode Transcript:
It's 1982 in Little Rock, Arkansas, I'm a skateboarding English major in a punk band with a poetry show on the local community radio station. And it's around that time that I get my first computer, an Atari 400.
The membrane keyboard is mushy and the memory comes in kilobytes. But this little machine changes my life. I become a techie.
Over the course of a year, I use my Atari 400 to teach myself the programming language BASIC. Eventually, I go on to land my first corporate job as a technical writer. And as I build my career, I work to keep up with all the latest changes in technology. For 30 years, through the rise of personal computers, the internet, smartphones, and social media, I feel things are moving fast and that I can keep up.
I build a career out of keeping up. And that leads me to today.
Let's talk about AI. Today, the rise of generative AI feels both familiar and alien to me. By my estimation, the potential of this technology is on par with the invention of the internet. It will change our lives in ways we can't comprehend just yet, which is disconcerting at times.
But what comforts me is that we've been here before. We've lived through the rise of the internet, social media, and a lot more. What feels different and exciting this time around is that we're going into this new future with our eyes wide open. We're asking the right questions and confronting challenges head on.
Welcome to Impact Audio. I'm Sam Caplan, vice president of social impact at Submittable. In January 2024, I attended Microsoft's Global Nonprofit Leaders Summit in Bellevue, Washington. We'll have an episode later this month reflecting on my experience in greater detail.
But what rose above all the noise at the event was a core question. What does it mean to use this new technology responsibly? It's a tough question. But thanks to folks like Beth Kanter, we don't have to start from scratch.
Beth Kanter is an author, trainer, and speaker who has spent the last 40 years advising the nonprofit sector on how to responsibly adopt new technology. Her most recent book, which she co-authored with her long-time collaborator, Allison Fine, is <i>The Smart Nonprofit: Staying Human-Centered in an Automated World</i>. In <i>The Smart Nonprofit</i>, Beth and Allison paint a clear picture of how nonprofits can use AI and automation to, as they put it, turn the page on an era of frantic busyness.
They explore ways to use this new technology to help foster what Beth calls a culture of workplace well-being. Beth also attended the global nonprofit leaders summit. Before we met up in Bellevue, I got her on a call to discuss the implications, risks, and potential of AI in the nonprofit sector.
Beth's journey to become a techie began in an unexpected place. Rather than an Atari 400, her entry into the nonprofit tech world started with the classical flute.
I started as a music major in classical flute. And when I discovered I wasn't going to sit first chair in the Boston Symphony, I started getting interested in running nonprofit arts organizations. And my first job was with the Boston Symphony in the fundraising department.
And in those days, the technology was the big Wang, with those gigantic disks. But then they wheeled in an HP, the first personal desktop computer. And nobody wanted to touch it. But I was really interested.
And I pulled it out. And I used Lotus spreadsheets. And I found, wow, I can be more productive. So that's what drew me into technology.
Once Beth got started in tech for nonprofits, she was all in. She started one of the first nonprofit blogs, now called Beth's Blog, documenting trends and new technology in the nonprofit sector.
The more she wrote, the more she had to say. So in 2010, she started writing books coaching nonprofits on how to adapt to changes in technology.
My first book was with Allison Fine; it was called <i>The Networked Nonprofit</i>. So the first book was really about the power of networks and social media to help nonprofits reach their outcomes. We co-wrote that together. And at that point, it was very early and people were scared or just didn't want anything to do with it. And so it was to introduce leaders to the power of this technology.
My second book, also about social media, co-authored with Katie Paine, was about measurement. Because at that time, there was no way, no metrics, no agreement about what you could measure around social media. In fact, people thought it was just--

you couldn't measure it, so it was a waste. So that was a deep dive into social media measurement, and what leaders need to think about, and what nonprofits need to do in terms of measurement.
My third book came out of seeing a lot of people in the sector, and this is 10 years ago, burning out and not taking care of themselves. And also having cultures where it wasn't a culture of well-being, it was more a culture of let's see how we can burn people out. And nobody wanted to talk about it.
In fact, when I first started talking about it, I was told it was a bunch of hippie crap. So I published a book long before the pandemic made well-being a popular topic. So I'm really glad to see more and more people talking about it.
And my fourth book, <i>The Smart Nonprofit: Staying Human-Centered in an Automated World</i>, co-authored with Allison Fine, was the result of our seven or eight years of research and writing about artificial intelligence and its impact on nonprofits. Now, there are many, many experts out there talking about this in the last year because of the rise of generative AI.
One through line in much of Beth's work is the concept of a culture of well-being in the workplace. Beth defines a culture of well-being as a workplace culture that prioritizes the mental, physical, and emotional health of its workers. She raises this concept as an antidote to nonprofit burnout. And this burnout shapes how she thinks and writes about everything nonprofits do, from AI to empathy and leadership.
In her book <i>The Happy, Healthy Nonprofit</i>, co-authored with Aliza Sherman, Beth argues nonprofit burnout is not the fault of the individual nonprofit professional. Instead, it's the result of a workplace's culture. And it's on nonprofit leaders to create a healthy workplace culture through five factors: function, friendship, feelings, forward, and fulfillment.
Functioning, does everybody have what they need to do the work?
And when we're talking about hybrid and remote environments, do people have access to the tools? Do people have the tools at home? Do they have internet access? Do they have someone come and help them about having an ergonomically correct workspace?
We have friendship, that's the second F. And that's about feeling connected and a sense of community, something that you have to work really hard at in a hybrid era. But this is about really making opportunities for people to get to know each other as humans.
And I'm not saying that everybody has to be everybody's best friend all the time and go out for beers all the time after work. But it's not just about work, it's also about relationships. So people need to build those relationships. And that's the elixir of productivity.
The third F is feelings. And that's the feelings of appreciation, which lead to a good positive morale, which also helps with productivity. The fourth F is forward. And this is all about professional development and professional growth, having opportunities for people to be learning, and growing, and acquiring new skills, and advancing in their jobs.
The fifth area is fulfillment. And this is being connected to the mission. And I think that's an area we do pretty well in. The others, I think, are where a lot of nonprofits really need to build.
To me, this need for a culture of well-being in the face of pervasive burnout is a useful way to understand the mindset of the nonprofit sector going into generative AI's rise. From my perspective, nonprofit folks seem to still be exhausted. And that exhaustion leads to cynicism, burnout, and ultimately turnover.
Depending on your perspective, AI has the potential to alleviate this exhaustion or make it much, much worse. Before we dive into the role of generative AI in addressing or perpetuating this problem, I wanted to hear Beth's thoughts on the state of nonprofit burnout today. So I asked her, has there been any progress in alleviating nonprofit burnout since she and Aliza wrote <i>The Happy, Healthy Nonprofit</i>?
Let me say, yes and no.
I wish there was more. I think the pandemic really accelerated the burnout that already existed, for a lot of reasons: scarcity mindsets, being short-staffed. There's this feeling in our sector that we have so much of a sense of duty to serving our clients, a sense of duty to the mission, that we'll sacrifice our own mental and physical health. And what I'm saying is, even with social justice issues and things that have been there for decades, centuries if you will, they're not going to go unsolved because we take some time off to refuel. And we need to do that.
Now in the pandemic, as you know, burnout rates soared. And I saw a study that said burnout is affecting more than half of all nonprofit employees. And the statistic that stood out for me was that 60% of nonprofit leaders reported feeling used up at the end of the workday.
And I have to say that even before the pandemic, the World Health Organization declared burnout an organizational issue, that it's not just an employee responsibility or the fault of the employee, but really it's because of certain organizational culture issues. And the things that cause burnout are unmanageable workload and unreasonable time pressure, a lack of role clarity and autonomy, a lack of support and communication from managers or executive directors, unfair treatment at work, including microaggressions and things like that, and then not encouraging individual work-life balance and self-care. And that comes with a cost.
This is Gallup information. Burned-out employees are 63% more likely to take a sick day and 2.6 times as likely to be actively seeking a different job. And if you've had to fill a position recently, you know that that's painful. And they're 13% less confident in their performance. So this is an organizational responsibility.
And just focusing on wellness, kale smoothies, and massages, that doesn't do it. You need an organizational approach where leaders are involved. And it needs to look at the toxic pieces in your culture and turn them around.

Now, some organizations are doing this by providing the flexible work that hybrid offers us.
Leaders are recognizing that there's a great loss when we have to hire people. And people leave because they're burned out. And they get burned out because of the culture. So I see a lot of movement now beyond just the lip service to really change the cultures.
So we know there remains a pervasive problem with burnout in the nonprofit sector. And a promising solution is a culture of well-being enacted through Beth's five factors at the management level. With burnout as a core part of the backdrop, how are nonprofit professionals reacting to the rise of generative AI?
Beth has observed two broad categories of responses: ostriches and early adopters. And these categories aren't new. According to Beth, with the advent of every new phase of technological advancement, these are the two groups that always emerge, whether it's in reaction to the internet in the 90s, social media in the 2010s, or generative AI in the 2020s.
What I've observed over the last year, and it's shifting, is basically a binary approach. On one side, there was the ostrich approach. Put your head in the sand, I'm not going to have anything to do with this.
And that's not new. I've seen that through every disruptive technology. And that's around outsized fears that the robots are going to kill us or they're going to kill our jobs or just being turned off by the whole thing. So because of that, I'm staying away from it.
On the other hand, I've seen the early adopters and the techno-lovers just jump in: let's see if we can leverage this and get the benefits of being early adopters. I've seen a lot of mistakes. So there's been this binary approach. And I think that we can't have that with this technology, that we need--
the toothpaste is out of the tube. We need to start adopting it. But we need to do it with thoughtful preparation, which includes thinking around the ethics and safety issues and how to do this strategically, responsibly in a way that is very human-centered.
And that's the heart of it. What does a human-centered approach to AI mean?
To answer that, we also have to ask, what parts of nonprofit work are uniquely irreplaceably human? And what parts contribute to the pervasive burnout and exhaustion in the sector?
According to Beth, a good start to answering questions like these is to be very intentional about how you use the time AI saves you. She refers to this phenomenon as a dividend of time, which is to say, at a fundamental level, AI could give you time by automating grunt work. How you reinvest the time AI gives you will have a huge impact on whether or not you're embracing AI responsibly.
If you're able to spend that time on fulfilling work, then you're using AI to create a culture of well-being. Recall that fulfillment is Beth's fifth and perhaps most important factor in developing a healthy workplace culture.
AI is going to have a profound impact on the way we do our work. And it's going to automate a lot of grunt work, a lot of what I call spreadsheet aerobics, cut and paste.
And it's also going to help free up time, maybe, to address things like the donor retention rate or help us think more creatively about strategy and innovation and have more impactful program delivery. What I get concerned about is whether or not we have this choice.
So we can use this to do more work more efficiently and become busier, or we can actually leverage this dividend of time to really address some of these really important issues and reinvent the next chapter of nonprofit work. And that's why it's not so much a technical issue of let's figure out the best prompt to write a thank-you note in our style; it's to rethink how we're going to invest that dividend of time. I mean, at a really minimal level, if we really invest the dividend of time into our organization's well-being, we would think about that freed-up time and let people have that freed-up time to think, to plan, maybe shift to a four-day workweek.
That depends on leaders being willing to get rid of their productivity paranoia, which is this mental model that equates productivity with actually seeing people at the screen, or typing, or being constantly busy in a face-to-face office, and really shifting that to outcomes instead.
This productivity paranoia is a problem. But there's an even deeper anxiety: replacement, which is to say replacing humans in the workplace with AI and automation. This anxiety is the reason ostriches stick their heads in the sand. It's also the reason why Beth and Allison Fine wrote their book, <i>The Smart Nonprofit</i>, in 2022. In it, Beth says the concerns of the ostriches aren't unfounded.
That's why we wrote the book. And that's the whole thing about responsible use, that it's not a cheap replacement for staff. And it's not something where you can cut and paste the output from it without human judgment and editing.
It's not a magic fairy dust where you push a button and it does it. You have to learn how to work with it. And it's a different way of working. And the metaphor, the word I like to use, is co-botting. And I think Microsoft uses co-piloting.
So humans are always in charge, with the last word. But what is your degree of collaboration, or co-botting, with the tool? Is it minimal? I'm just going to use it to do some copy-editing because that's what I need.
Or is it going to be somewhere moderate where it may generate some ideas? Maybe it gives you a couple of sentences here and there. Or will it be maximal and generate the first draft on which you edit and put your voice in?
So I think about those co-botting use cases. And we really have to learn the human-centered side of that. Because we have to be in charge. It's our human creativity, our judgment, the things that make us human that we have to retain. And that's the thing that people are scared about. I hear that.
I just came back from doing workshops in Ventura. I did a keynote on the crossroads of human-centered leadership and generative AI. And then I did a workshop on leading with empathy, which had nothing to do with the technology, because we also need to keep our human skills and retain them.
To address the concern of ostriches and help them catch up to early adopters, Beth advocates for a cohort-based approach to adopting AI. And funders have a role in making these cohorts a possibility.
Well, I have to say that aside from technology, my lens has always been as a trainer and facilitator. I've been doing cohort groups in adopting technology and new practices for the last 25 years. And I have started and have been working on a lot of cohort projects around the adoption of AI.
And some of that's beginning with developing your ethical and responsible use framework and policy and an acceptable use policy for AI. Making the time to do that, not just cut and paste somebody else's. Because people in nonprofits need to talk about this. And to do this as a cohort, so people are learning from one another to be able to develop their prompt playbook and to be able to experiment and iterate and shift into this workflow.
And if they don't have the dedicated time in something like a cohort process, they're doing it on the fly. And it's like a little experiment here, an experiment there. A cohort training process, especially one that a funder has invested in, gives them the curriculum and the time to do it, to learn, to experiment, and to integrate that into the organization's work culture.
The past three decades have taught me two important lessons. One is that skateboarding is never going out of style. And the second is that no one knows exactly how a new technology like generative AI is going to mature. We're in the early stages of generative AI.
If 2023 was the year of shock at its sheer power, 2024 will be the year we start harnessing that power for good. We've been here before with other technological advancements. And there are going to be twists and turns, embarrassing mistakes, and world-changing innovations.
Making sure we use this technology responsibly starts with how we use our time. Let's use this technology to make time for ourselves. And let's use this time to think deeply and humanely about the role that AI is going to play in the rest of our lives. I think Beth articulated it well when she described what AI provides us as a dividend of time. Thinking about it this way forces us to confront how we're spending the time that we're given. So how will you spend your time?
Earlier this year, both Beth and I were part of a large cohort of nonprofit professionals at Microsoft's Global Nonprofit Leaders Summit. Later this month, I'll reflect on what I learned there along with my colleague Sam Ellsworth. So subscribe to Impact Audio if you haven't already to hear all about it.
You can also subscribe to my newsletter, The Review, at submittable.com/newsletter.
I also encourage you to check out Beth's Blog at bethkanter.org. You'll find her own reflections on the Microsoft event there, as well as collaborations with our friend Devi Thomas, global lead of nonprofit community capacity at Microsoft Philanthropies. That's all for me today. Thank you for listening to Impact Audio, produced by your friends here at Submittable.
[MUSIC PLAYING]