Dr. Margaret Heffernan is a 5-time CEO, BBC program producer, and author of 6 books. Her book, Willful Blindness: Why We Ignore the Obvious at our Peril, was named one of the most important business books of the decade by the Financial Times. Her 3 TED talks have been watched by more than 20 million people.
Author of Willful Blindness
As the Co-founder and CEO of Alation, Satyen lives his passion of empowering a curious and rational world by fundamentally improving the way data consumers, creators, and stewards find, understand, and trust data. Industry insiders call him a visionary entrepreneur. Those who meet him call him warm and down-to-earth. His kids call him “Dad.”
Satyen Sangani (00:03): In 1968, social psychologists John M. Darley and Bibb Latané introduced the world to a concept called the bystander effect. Through a series of experiments, the two scientists demonstrated that people were less likely to intervene during an emergency if there were multiple witnesses. Each appeared to rely on the others to act.
For Data Radicals, this is a fascinating paradox. Why don’t people speak out when the data demands that they ought to? Do they assume other people will, or are they scared of retribution? This paradox is one that has been on my mind recently, as we’ve seen stories of corporate malfeasance at Theranos and at Facebook, now known as Meta. At both of these companies, many chose to be bystanders. Others, whose names we don’t know, probably pushed for change internally. And a few became whistleblowers, such as Frances Haugen of Facebook and Tyler Shultz and Erika Cheung of Theranos.
(00:57) Why don’t more people speak up when they see trouble? How can employees intervene when they do see trouble? And how can organizations create best practices that ensure people will feel empowered to speak out?
Today, we’re going to dive into just these questions. Our guest, Dr. Margaret Heffernan, wrote extensively on this topic in her book, Willful Blindness: Why We Ignore the Obvious At Our Peril. Her newest book is Uncharted: How to Navigate the Future. She’s a writer and a speaker whose TED talks have been seen by over 13 million people, including me. So let’s get into it.
Producer Read (01:37): Welcome to Data Radicals, a show about the people who use data to see things that nobody else can. This episode features an interview with Dr. Margaret Heffernan, author of Uncharted: How to Navigate the Future. In this episode, Margaret and Satyen discuss willful blindness, common characteristics of whistleblowers, how organizations can create safer environments for their employees to point out wrongdoings, and much more. This podcast is brought to you by Alation. Catch us at Snowflake Summit this summer. Hear how people use Alation with Snowflake to uncover insights and business opportunities in data, innovate ahead of the curve, and build data products that were unthinkable just two years ago. Snowflake Summit runs from June 13th to the 16th. Attend virtually or in person in Las Vegas. We can’t wait to see you there. Learn more at snowflake.com/summit.
Satyen Sangani (02:32): In Willful Blindness, Margaret tells the story of Alice Stewart, a doctor whose research was ignored by the establishment. It’s a fascinating story that perfectly illustrates a whistleblower’s challenge.
Dr. Margaret Heffernan (02:43): Alice Stewart was one of the very earliest epidemiologists in the UK. She trained as a doctor in the late 1930s, early 1940s, which was very unusual, obviously, for a woman. She passed her entrance exams for the Royal College of Physicians at an earlier age than any woman ever had. And she became very interested in this new field of epidemiology and had the opportunity to start exploring it, as people were asking a lot of epidemiological questions in the course of the Second World War. And like any good scientist, what she knew she needed to do was find a hard problem and solve it. And the one she found was to do with the rising rates of child cancers, which was a strange case because most diseases correlated with poverty.
(03:39) And in this case, the kids who were getting cancer seemed to be coming specifically from affluent families. So this kind of anomaly potentially is full of useful information. Like many scientists then and now, she found it difficult to get her research funded, but she sort of scraped stuff together. Of course, it’s before the internet. So doing a huge study — I think this was 500 families with kids who had cancer and 500 who did not — was all carbon-copied forms, which she persuaded the health service to fill in. And she’s kind of amazed when the results came back through the mail, as they did in those days.
(04:24) And by a rate of 3 to 1, the kids who had cancer had mothers who’d been X-rayed when they were pregnant. Now that kind of statistical clarity is very rare in science. So this is a big moment. And she rushes to publish her preliminary findings in The Lancet. Everybody gets very excited. People start talking about her Nobel Prize, and she’s instantly commissioned to do a very similar study in Scotland. Now she thinks, “I must rush and catch all these cases, because obviously we’re going to stop X-raying pregnant women. And then we could lose the deeper knowledge in here. So I really better get my skates on.” In fact, she didn’t need to rush, because it was 25 years before the medical establishment decided that X-raying pregnant women was a bad idea.
Satyen Sangani (05:12): So if the data was clear, why did it take 25 years for the medical community to listen?
Dr. Margaret Heffernan (05:18): The quick conclusion people reached for, as to why people ignored this data, is that she was a woman, so they trivialized and marginalized her. And that’s a really comforting excuse, but it doesn’t hold water, because when her results came out, a researcher at the Harvard School of Public Health instantly replicated her study in the U.S. Much bigger study, very similar results. And he was just as overlooked. So this was a case of what I called “willful blindness” in my book of the same title. And so I was really interested in, okay, so what’s going on? Why are the doctors ignoring this? And there are quite a lot of reasons. The first reason is, the X-ray was a very cool technology. It was solving all sorts of phenomenal problems, answering lots of phenomenal questions. And like all cool technology, people thought this was the answer to everything.
(06:18) So they were using it to find pieces of jewelry that had been baked into cakes. They were using it to X-ray feet in shoes to make sure that the fit was perfect. They were using it for everything. And they didn’t like the idea that this wasn’t the miracle that it purported to be, that it might have problems. So that’s part of it. Also, doctors really do not like anything that suggests they are less than 100 percent humanitarian and saintly in their dedication. And this suggested they might be human. So that really undermined their sense of professional entitlement to respect. So both of those things I think are germane, but the real thing was that, at the time, the prevailing theory of disease was that everything is healthy up to a threshold, only after which it becomes dangerous.
(07:17) So this is something that was called Threshold Theory. And pretty much everybody bought into it. And confronted by Alice’s data, they had to choose between the theory and the data. And these most scientifically minded of people chose the theory. Now this, in my book, leads to an extended argument about theories and mental models, which we create and need, and cleave to. And we couldn’t function in life if we didn’t have mental models of how the world works, if epidemiologists didn’t have a mental model of how disease works.
(08:00) But the difficulty is that they are so useful to us that we tend to ignore the data that doesn’t fit. And to his credit, the economist Paul Krugman once made this beautiful, casual throwaway line about economic models. He said, “I sometimes wonder if the data that didn’t make it into my models wasn’t more important than the data that did.” And what this does is really illustrate the degree to which, as helpful as our mental models are, they’re also a problem: they will marginalize, trivialize, and repel what doesn’t fit. And that’s how we end up making these gigantic errors.
Satyen Sangani (08:48): So 25 years is an incredibly long time. And the impact of the data is that you are seeing children with cancer. Could there be a more obvious story for somebody to point to? But even in that case, it took 25 years. Now, the listeners of this podcast are people who, within their organizations, are trying to produce more data. But I think there’s also generally a sense that within companies, within organizations, within the world, if the truth is there, it’ll set you free and people will automatically react. And I guess what would be really helpful is to understand your learnings: if you’re in this circumstance of being a truth teller, how do you avoid the circumstance where you have to wait 25 years for everybody else to listen to you and to catch on?
Dr. Margaret Heffernan (09:38): Sadly, the truth doesn’t set us free. And disruptive truths often have to wait a long time to be accepted. I think if you start from the premise that the data alone won’t win the argument, that’s really useful. Because I think if you look at the greatest communications failure probably in human history — which is our failure to understand climate change — I think what you see there is at least 35 years of thinking the data speaks for itself. It’s very clear it doesn’t speak for itself. And usually what happens, almost predictably what happens, is people confronted by data they don’t like ask for more data. I see this in boardrooms, for example, all the time. And so that often doesn’t make it any better. It just makes it worse. So this is a way of saying, “I don’t understand.”
(10:39) It’s a way of not having the argument. It’s a way of postponing the argument. So that’s what not to do. So what do you do? I think you have to start thinking about what this means for the things that people care about. So, if I really want to talk to people about why they should stop X-raying pregnant women, I need to start talking to their desire to do their utmost to bring really healthy children into the world. And, if I think about the medical establishment in Britain in the 1950s, or as I imagine it to have been, I probably would have done well to talk to the parents rather than the doctors.
(11:31) Because parents are pretty unequivocal about their love for their children. That would’ve been probably a softer landing ground for my message than the medical establishment which had all sorts of different defenses up. So you have to kind of get under the data to think, what does this mean about the things that people care about most deeply? And how can we connect there? And not through frightening people. And not through blaming people. But thinking what you want in life — healthy children, healthy communities — is being threatened by this thing. So how can we protect what you love? It’s a whole different conversation.
Satyen Sangani (12:27): Besides obviously tailoring the message, and certainly by talking to the impacted stakeholders, are there any other lessons that you’ve seen that have worked, or any other examples of individuals who have done a particularly poor job or a particularly good job?
Dr. Margaret Heffernan (12:43): I think you have to meet people where they are. I think you have to listen and understand what they care about and then think: what does my data mean to them? So I think this translation from the data into a narrative is a really crucial stage. And I think mostly scientists jump over it. Now, let me be really fair. I think we are living through a kind of golden age of scientific writing. I think there’s a great deal of cultural translation going on. Wonderful writers who are able to talk about this in a really accessible way. But the very best of them, it seems to me, are not trying to ram the data down people’s throats. They’re really trying to speak to what it is that people care about and make that connection. And what I certainly encounter with people in all walks of life, whether it’s scientists trying to get their message across or leaders trying to get their message across, is they find it gruesomely difficult not to see the story through their own eyes.
Not to think, “I know, you don’t; you need to accept my truth.” You have to be able to accept other people’s truth before you know how to make the two talk to each other. And it’s not about compromising; it’s about going deep enough to find where the two truths intersect. And I think we’ve generally been poor at that. It’s definitely not part of scientific training. But, to cut people some slack here, I also think a lot of scientists have been feeling this is really urgent, and they’ve moved too fast and left people behind. And this is a real tension: if you don’t invest the time in really creating a community of trust, then in your speed you’ve made your job longer.
Satyen Sangani (14:55): Which I think we’ve seen with the pandemic in so many examples. Certainly, in the United States, there’s one side of the policy-setting community which consists of people who are trying to optimize for people’s lives, for public safety as defined by the number of hospitalizations and casualties. And on the other side, you’ve got folks who are trying to optimize for the economics. Those are two very different perspectives. And obviously, lots of value judgments are implicit in all of that, but it’s really difficult to have those conversations. And I think data people would like to think that they have the intellectual flexibility because they have the data, but with that often comes a little bit of indignation, maybe a little bit of pridefulness that they have the higher ground. And I think that’s something you have to guard against often. That sort of high road, I think.
Dr. Margaret Heffernan (15:50): So I think this is really interesting, because a couple of years ago — so my data will be somewhat out of date, I expect — I did some work with Gartner, and they shared with me data showing that the longevity of chief analytics officers or chief data officers in companies was running at approximately 16 months. In other words, someone would get appointed, they’d be in the job for 16 months, and then they’d just leave. Why? Basically, they would rock up and dig into the data and find all sorts of stuff and start telling people stuff. They would often do that before getting to know people, getting to know the culture, getting to understand the strategy, integrating into the business. At the same time, the non-data people would often feel quite threatened by this appointment and think, “It’s a data person. I can’t stand that. I don’t really understand it. So I’m going to keep my distance.”
Dr. Margaret Heffernan (17:01): And so the data people didn’t understand context and strategy. And the strategy people didn’t know how to frame good data questions. So they would routinely have these experiences where the chief data officer would come with all sorts of insights drawn from the data. They wouldn’t have any natural allies and they wouldn’t necessarily tie the data to the strategy. And so nothing would happen. And very often, of course, what senior leadership teams or boards would do is ask for more data, because that’s how you don’t say no. And so eventually, they’d get fed up and frustrated and leave. So this is really a failure to find a common language. And the alternative is they would sometimes just say, “Yes, you must be right.”
(17:56) Which is not actually an interesting response for a data scientist. The data scientist wants to know: what are the next really interesting questions to ask so we understand more? So this kind of passive acceptance isn’t a great accolade either. And so they didn’t feel that they were really being useful. So I think that’s a kind of classic description of what often happens, which is either the data is over-worshiped or under-worshiped, but people, often non-data scientists, are very poor at understanding how to respond to it. What does it mean? So never mind what the data says: what does it mean? And that’s the crucial missing link for the data to matter to people.
Satyen Sangani (18:48): It’s interesting, because this concept of both having to tailor the message and having to interpret the message suggests there’s a fundamental block. And you talk a lot about empathy in your work, in order to have people really see the other side of a discussion. But one of the interesting things that I’ve noticed is in these stories you tell. One of them is, of course, of Alice Stewart. Another is of this woman by the name of Gayla Benefield. And more recently at Facebook, for example, we saw this woman by the name of Frances Haugen, who got up and talked about all of the impacts of Facebook on teenagers.
Satyen Sangani (19:28): And then another great story in the media recently is this woman by the name of Erika Cheung, who, along with Tyler Shultz, came along and exposed Theranos. And you think about all these women, and you write a lot about women who are these whistleblowers. Tell us a little bit about this phenomenon. Because I actually think it’s interesting, and I’ve seen, in my travels, lots of women in data. So I’d love to get your perspective: why are the whistleblowers you’ve talked about so often women? Am I seeing data that doesn’t actually exist? I’d love your feedback on that, because it’s such an interesting topic.
Dr. Margaret Heffernan (19:59): So you are seeing data that doesn’t exist. There is a perception that whistleblowers are more likely to be women. The data does not support that. It used to be thought that because women were slightly marginalized, they might therefore have a different perspective. But the gigantic amount of research into whistleblowers doesn’t really agree. The thing that’s really interesting about whistleblowers is that they’re deeply ordinary. What distinguishes them is they tend to have come to their work with, if anything, slightly higher levels of engagement and commitment, and therefore expectation. So they’ve joined the company really wanting to do good work. And what upsets them is discovering that they can’t. And this is true of male and female whistleblowers that I’ve interviewed. They get upset because they thought, “I thought this was a great company. I thought we were going to do great stuff. I thought we were good people.”
(21:09) And that’s what makes them really angry. They’re not cynical to start with. And what’s intriguing is how often they will try, first of all, to make internal change. And it’s only really when they’re blown off that they then think, “Right, this is so bad, I have to go outside.” There are a couple of interesting things here. One is, it frequently happens that they try to change things and succeed, and those stories almost never get written. Because nobody wants to write about the bad stuff, especially once it’s fixed. But they happen a lot. It’s a kind of invisible narrative. The other thing that I think is really telling: I wrote two plays about Enron for the BBC. And as part of my research, I went and talked to Sherron Watkins, who is known as the Enron whistleblower.
(22:08) Now, Sherron is a classic whistleblower. She absolutely loved working for Enron. It had given her tremendous social mobility; she was very committed. And when she went to work in treasury and started seeing all these terrifying things, her only thought in writing to the chairman was: he’ll fix it. He’s a good guy and he’ll fix it. She never imagined that as soon as he talked to her, he’d pick up the phone to see whether he could get her fired. But she never went public. Her letter was found because of the DOJ investigations. And I said to Sherron, why her? Because we talked a lot about her colleagues who had kind of known and done nothing. And she told me this great story, which is that at one point she and the WorldCom whistleblower and the FBI whistleblower were all named Time magazine’s Persons of the Year. And they got together for the photoshoot, and they were asking themselves this same question.
(23:14) And the only thing they could find that they had in common was that they’d grown up in small communities of less than 10,000 people. And what Sherron said to me was, in towns of that size, you don’t just walk on by if a tree’s falling down, or if a kid is hurt, or if you see something wrong. You do something about it, because it’s your town. And I thought that was a really interesting insight. And I think there’s a question, especially when I think of corporations: How far do people feel it’s their company and that they need to fix it? And how much do they feel, “It’s not my problem, it’s not my department, it’s my boss’s problem”?
Satyen Sangani (24:09): Whistleblowing is an act of last resort. How can companies create an environment where people are comfortable speaking up? But then, of course, there’s this question of what the first step is. I guess that could be different for different people, but how do you think about that, and how do you advise people when they ask you?
Dr. Margaret Heffernan (24:17): I spend a lot of my working life these days working with companies to help them figure out how to do that. And working with individuals, because there’s been a lot of talk about needing to have an environment of psychological safety. And I think it’s an important idea, but I don’t think we’ve taken it nearly far enough, because a great deal of the danger of a working environment isn’t to do with the company itself; there are also huge external factors. In a very fragile economy, people are just not going to feel safe. If they’ve come from a very difficult family background, they’re not going to feel safe. So actually it becomes very important to give people tools to figure out: How do I raise an issue which I think is important, and do it in a way that is safe? And I’m astounded, as I do this work, by how little people know about how to do it.
(25:16) I often think a lot of the stuff I write about is blindingly obvious, because I forget that once upon a time, I didn’t know it either. But I think, increasingly, it’s just a central part of a grown-up education to know, when there’s something I’m really bothered about, how do I raise it? And typically, what I find is people think, “I have two options: I can stand on the table and scream about it, or I can shut up.” And the truth is there is a world of possibilities between those two extremes. But mostly people are too afraid to explore them. There are lots of things you can do. Now, I’m not saying every single time there’s a magic answer. But, rather like my client who learned how to deal with his issue, I am persuaded there are more ways to raise issues than most people have any training for.
(26:15) And I think we have to put that power into people’s hands, because the whole premise of organizational life is that groups of people know more than individuals working alone. But if we can’t have these difficult, tricky, sensitive conversations, then we’re losing a vast amount of information. And the longer everyone goes on not talking about Volkswagen’s emissions, or not talking about Wells Fargo’s mis-selling, or not talking about the fact that the Theranos tests don’t work, the longer the harm continues, which means the bigger it gets, which means the more life-threatening it becomes to the organization itself. So actually there’s a huge upside if we could actually teach people how to raise these issues early, when they’re much easier to address.
Satyen Sangani (27:07): We all have heard the story of the emperor having no clothes. And yet it feels like, in a lot of cases, people simply agree that the emperor has a lot of clothes and go along with it every day.
Dr. Margaret Heffernan (27:17): Yeah.
Satyen Sangani (27:18): What are the tips and tricks that you would recommend? If one bookend is saying absolutely nothing and standing like a statue, and the other is standing on the table screaming and gesticulating wildly, what’s the middle path?
Dr. Margaret Heffernan (27:31): I think the first is: check your facts. You might be wrong. Because I’ve certainly known many situations where people thought it was one thing and actually it was another. So that’s crucial, obviously. The other thing is to find allies. Do you know people in the organization who see the same thing you think you see and care about it? That’s always going to be safer. And then think about who in the organization might care. So can we find more powerful allies? And then figure out: how does that person communicate? Are they going to be better off if we write to them or we talk to them? How do we share this information in a way that makes it as accessible to them as possible — and present it as: we think there’s a problem, and we’d like your views on it. Is this the case? And if so, what can we do about it?
(28:30) So present it as a form of good citizenship, which it is. That does not guarantee that you’ll have a happy ending. But saying nothing guarantees that nothing will happen. I think that’s loosely what you have to do. But what it does require is quite a lot of perspective taking, which is: Who’s involved? Who’s getting hurt? And what does it look like to other people? And how can we present this to them in a way that articulates our concern and goodwill? And how do we assume goodwill on their part? Because you can easily get dragged into this spiral of “This is terrible and these people are just being sloppy and they don’t care.” And it all becomes highly binary really fast. If you assume goodwill and you act like a good citizen, as I say, there’s no guarantee, but it’s much more likely you’ll be listened to. And much more likely either that you’ll learn something you didn’t know, or — and I have seen this happen — that you actually get promoted: “Thank you. We needed to know that.”
Satyen Sangani (29:46): Who your organization hires is also incredibly important. If you want to build a culture where your employees feel more connected to your organization, then make sure you’re not hiring super chickens. Not following? I’ll let Margaret explain.
Dr. Margaret Heffernan (30:00): This is really about an experiment that was done by a scientist at Purdue University, who was really interested in productivity. But unlike the rest of us, or most of us, he wasn’t thinking about productivity in people. He was thinking about productivity in chickens. He was an evolutionary biologist, and he wanted to understand what made chickens particularly productive. So he conducted this very beautiful experiment. Chickens are very social animals. So he went through his lab/farmyard and he identified the most productive flocks. And he simply put those to one side, really, to watch what they did and how they behaved. He did that for six generations. But then he added a sort of different model of productivity: he identified the individually most productive chickens, so you might call these super chickens. And he put them together into a super flock.
(30:58) So six generations passed. He comes back to see: how are these two different flocks doing? The first flock is doing brilliantly. It’s more productive than ever. All the chickens are really healthy and they’re producing more eggs than they ever did before. Fantastic. The super flock is a completely different story, because all but three are dead. And he concluded that in that flock, the productivity of the few had been achieved by suppressing the productivity of the rest. Now, the reason I love this story is I think we’ve run an awful lot of organizations on the super chicken model. If you think of GE’s magic medicine, you glorify and worship the top 10 percent. And what did that do?
(32:01) It did two things. It made people at the top super fractious and uncollaborative, because obviously, “If I help you, Satyen, and you do better, then I fall down in the rankings. Why would I do that? No way.” So it made the top bit really fractious. But you don’t have to be a mathematical genius to figure out that, in this bell curve, the safest place to be is the big fat middle. So be average. Really work at being average, because then you’re not going to be threatened and you’re not going to be glorified. You’re just going to have an easy life. And many people have used that as the explanation for the kind of incredible doldrums that Microsoft went through in the late ’90s and early ’00s, when it ran a pretty vicious form of forced ranking, led by Steve Ballmer, a hyper-competitive individual. And what happened was huge amounts of infighting at the top, and lots of people in the middle just sitting there, kind of waiting for the dust to settle to figure out who won, who lost, and what are we doing today?
Satyen Sangani (33:15): Yeah. And that work that you did was so impactful on me in building the culture that we built at Alation. Because countless sales executives, as you can imagine in Silicon Valley, have come in as we’ve built the company over 10 years and said, “Look, I just need to pay my best performers the best. And I need all of my money to go to those individuals. And by the way, we need to make sure that those top people are celebrated and we’ve got to make sure that they’re taken care of.”
Satyen Sangani (33:47): And my answer is always like, “But what are you doing with the other 80 percent? How does the work get done?” And interestingly, what I’ve found is that people who prioritize culture and company over individual performance, particularly through compensation, and who really view it as a team sport, often produce a much better outcome, irrespective. They can be very talented. But the ones who need the best pay, who need to be at the very top, and who optimize every single career decision end up being the ones that you want to work with the least.
Dr. Margaret Heffernan (34:22): It’s incredibly disruptive. And I remember when I was working at the BBC, I had a young trainee — I didn’t know about super chickens at the time, and boy, he was one. And I oversaw his first filmmaking, which was a catastrophe, because he was so determined that he was going to do everything and that he knew best that, although he was surrounded by super-talented people, eventually they just stopped helping him. Which, given he’d never made a film before, was quite a statement. And I remember sitting down with him afterward, when we’d had to reshoot the film and recut the film, and explaining where I thought he’d gone adrift. And he turned to me and he said, “Well, Margaret, maybe one day when I’m as old as you are I’ll be as nice as you are.” At which point I thought, “Well, f**k off.”
(35:14) But anyway, flash forward 20 years: I’m on the board of a TV company, and we’re having this conversation about this project which is in dire straits, and it’s really expensive, and it’s very political. And I say, who’s running this project? And it’s this guy. And his whole career was clearly spent bouncing from company to company, where he dazzled people, never learned to work with people, and got shunted off. And so I think, personally, I’m not persuaded super chickens can change. There must be some who can. But I think you are much better off investing your management effort in getting the whole thing to work.
Satyen Sangani (36:09): Creating a data culture begins with encouraging data literacy and fluency. Enabling visibility and transparency is also key, but it doesn’t end there. Instead, I encourage all Data Radicals to think about how they can empower everyone in their organizations to be agents of change. Determine how your employees can feel more engaged and connected to your organization and its aspirations. Because if your employees don’t feel that sense of connection, there may be consequences down the road. And there’s nothing radical about that. So thank you to Margaret for joining us for this episode of Data Radicals. This is Satyen Sangani, co-founder and CEO of Alation. Thank you for listening.
Producer (36:50): This podcast is brought to you by Alation. Data citizens love Alation because it surfaces the best data queries and expertise instantly. The result? Folks know how to use the most powerful data with guidance from the experts. And with Alation, you don’t have to choose between data democratization and data governance. By embedding governance guidance into workflows, Alation welcomes more people to great data fast. That means your data strategy can play both offense and defense. Learn more about Alation at A-L-A-T-I-O-N, alation.com.
Season 2 Episode 8
How can a software engineer create the next big thing? According to Matei Zaharia, creator of Apache Spark and co-founder of Databricks, it demands a single architect to build the cathedral – and an open bazaar to empower the masses. In this conversation, Matei shares his startup philosophy and reveals exciting advancements with Databricks Unity Catalog and Dolly 2.0, an LLM for enterprise.
Season 2 Episode 3
“Vulnerability” is not a word you hear often in tech, but it forms the foundation of success for AI expert Jepson (Ben) Taylor, Dataiku’s chief AI strategist. In this conversation, Jepson reveals the passions — and struggles — of launching (and working for) a startup, how to embrace “failing fast,” and the role of human connection even in the realm of “artificial” intelligence.
Season 1 Episode 2
This episode features Cole Nussbaumer Knaflic, founder & CEO of Storytelling with Data and author of two best-selling books, “Storytelling with Data: Let’s Practice!” and “Storytelling with Data: A Data Visualization Guide for Business Professionals,” which have helped tens of thousands of students tell powerful data stories.