Maddy Want is the VP of Data at Fanatics Betting & Gaming, where she created the data strategy and built the data team as the online retailer Fanatics expanded into U.S. sports betting. She is also the co-author of Precisely: Working with Precision Systems in a World of Data. With an interest in the intersection of public policy and technology, she holds an MPA in technology policy from Columbia University.
As the Co-founder and CEO of Alation, Satyen lives his passion of empowering a curious and rational world by fundamentally improving the way data consumers, creators, and stewards find, understand, and trust data. Industry insiders call him a visionary entrepreneur. Those who meet him call him warm and down-to-earth. His kids call him “Dad.”
Producer 1: (00:00) Hello and welcome to Data Radicals. In this episode, Satyen sits down with Maddy Want. Maddy has over a decade of data product experience spanning diverse web and app services. When she joined Fanatics Betting & Gaming, she was tasked with creating the data strategy, hiring the data team, and partnering with tech. Previously, Maddy served companies like Audible, upday, and Index Exchange. In this new episode, Maddy discusses her new book Precisely, data governance, and why precision matters.
Producer 2: (00:32) This podcast is brought to you by Alation. The act of finding great data shouldn't be as primitive as hunting and gathering. Alation Data Catalog enables people to find, understand, trust, and use data with confidence. Active data governance puts people first so they have access to the data they need, with in-workflow guidance on how to use it. Learn more about Alation at alation.com.
Satyen Sangani: (01:01) Maddy Want is the Vice President of Data for Fanatics Betting & Gaming, and the author of the recent book Precisely: Working with Precision Systems in a World of Data. The book reveals how firms like the New York Times went from digital novices to leaders, transforming their legacy tech stacks to precision systems. She has served as the senior director of product management for Index Exchange, and held similar roles at Audible and upday. She holds a master's from Columbia University and a bachelor's degree from the University of Sydney. Maddy, welcome to Data Radicals.
Maddy Want: (01:33) Thank you. Thanks for having me.
Satyen Sangani: (01:34) Let's dive into the book because certainly that's the core insight. Why did you write the book? What is the book about?
Maddy Want: (01:39) I was in the third of three years of a master's degree at Columbia, basically studying the intersection of tech and public policy. I was spending a lot of time explaining the tech perspective on tech and policy issues to my fellow students, to my professors, through essays, basically whenever possible. There were not many people from technical backgrounds in that class, and I came to see a huge pattern in what I was describing. Every time I would describe a new emerging technology and its potential impacts, just helping people understand how to think through technology, I would notice that I was using basically the same framing: “there was an old way of achieving something, and now with this new technology, there is a new way of achieving that same thing, and it is so wholly different,” and the best way to think of it is as a step-function increase in precision, in our ability to do things precisely.
Maddy Want: (02:35) The name of the book is very literal, and from that name, we coined the idea of a precision system. And to anybody who works in tech today — probably the majority of the people who will be listening to this podcast — these are the systems that we mostly take for granted. These are the products that we build for customers. They're the systems that we use to build the products for customers. They're cloud-based, big data fueled, they're based on cultures of experimentation, optimization, and we basically have expectations that things will be performant, precise, reliable, and delightful, and that's what we all aim for. But the vast majority of the world does not work in tech, obviously, and some of the concepts that we take for granted in tech are very, very foreign.
Maddy Want: (03:20) I was trying, and so was Zach [Tumin], my co-author, to find ways to frame technology that were totally accessible to people who have no technical background whatsoever and never will. Explaining the functioning of a specific technology is not gonna be helpful; it doesn't always resonate. Instead, explaining the capability that it gives its customers or its users or society at large, that tended to resonate better, and so this idea of precision systems really took hold.
Satyen Sangani: (03:46) And tell me a little bit about the degree that you achieved where tech wasn't the primary focus. I think it was a Master of Public Administration. So, you worked for a little bit after you graduated from your undergraduate, you then went back to do a master's degree, as most people are wont to do. Why get this public administration degree versus another technical degree or something that was more in line with your work history, and what was it that you were trying to motivate change around?
Maddy Want: (04:08) I think at that point in my career, I'd been working for eight or nine years, and I had been in and around consumer tech the whole time. Number one, I was looking for something fresh, something slightly different. What I was reading on the weekends was a lot of public policy deep dives and features, Brookings Institution type of content, and I started to get really interested in and focused on the policy content that was digging into the implications of technology. That intersected nicely with my background, and I thought, this is a natural extension. This is something that's gonna help me do my job better today and also create more breadth, so that tomorrow I could do something else that's further into that world and a little further away from my current world of tech. It served both purposes. And it is a Master of Public Administration, which, for those who aren't familiar, is basically the MBA of the public sector.
Maddy Want: (05:02) It’s a lot of finance, management, economics, that sort of thing. And then in the the electives, you can choose to dive into whatever you're passionate about, and for me that was the tech side. At the time, I continued working while I was doing the degree. At the time, I was really still deeply in the area of privacy, and that was such a huge dominant topic at the time. It still is, but it's losing steam compared to the GDPR years of 2018, 2019. And that was very much a global conversation and a conversation about how societies — and, as much as they're interconnected, countries — are gonna legislate technology and the impacts of technology on privacy specifically. So it felt very, very relevant for that purpose.
Satyen Sangani: (05:45) And I would expect that whether it's an MBA or an MPA, you would see people who are, if they're not technically inclined, at least understand the world of technology. What was the disconnect between their frame and what didn't they quite understand about the world of technology, and where did you see the disconnect existing between their comprehension and what was happening in the world around us?
Maddy Want: (06:05) It was really interesting, actually. I saw a couple of themes. One was that people who get into a public policy degree are often coming from deep inside public policy. They're coming straight from government, or they're coming from nonprofit enterprises funded by public money, and they just haven't ever had a need to deep dive into the kind of technology that most of the tech world builds and builds for. The conversations I often found myself having in class and with my classmates were skeptical, I think, about technology, and often skeptical to the point of resistance. I've got a good example; it was just one of many. In one class, we were discussing the sensors that parking garages put under each parking spot. When a car is on top of one, it senses the presence of the car, and that feeds a sort of red and green indicator at the beginning of the aisle to say, we've got three free spots in this aisle, which can inform whether you're gonna turn down that aisle or not.
Maddy Want: (07:09) I came to understand that the rest of the class thought that those red and green dots at the start of the aisle were instead being fueled by camera imaging of your car, of you, of your number plate, and so they were very much thinking of this as unnecessary, potentially dangerous, potentially privacy-concerning software. And all it took was a brief explanation that the dots were actually sensor-powered; they don't gather any other information. That was resolved, and we moved on, and it was that kind of thing over and over again. I feel like I spent a lot of time defending tech, but all I was really doing was disambiguating, to the extent that I understood the tech, how it really worked and what dangers I thought it posed.
Satyen Sangani: (07:52) Yeah, it's interesting. Something that would otherwise have been done manually, in that case and many others, is getting done through the intermediation of technology, and certainly in that case, there is no human in the loop. But I think a lot of the criticism of tech is that what formerly was a human interaction, or multiple humans interacting with each other, has been disintermediated. Even though there is a digital medium, that digital medium doesn't quite capture the fullness of how people interact. There might be more precision in a narrower sense, but maybe the breadth, sophistication, and depth that two people talking to each other might have doesn't quite exist. That feels like where a lot of people are losing trust, because they're feeling like there's less context. Did that come up? How do you think about that overall?
Maddy Want: (08:37) We definitely treat that in depth in the book, and so I did spend time thinking through that a lot. We wrote this prior to the explosion of ChatGPT and similar generative AI technologies, which really have taken those same problems and just turbocharged them, made them feel much more urgent. The impact is much more clear, et cetera. But a couple of years ago, those same problems existed. They were just in the format of AI more generally, and to what extent we would allow technology to make decisions on our behalves, and in what circumstances that would be acceptable and what kind of accountability would be needed from the technology itself and from those who offer the solution via the technology. That sounds very broad, so I'll give an example.
Maddy Want: (09:24) Apple and JP Morgan [Goldman Sachs] partnered to launch a credit card some years back. You may have heard of it, you may not have; it didn't go fantastically. I'm not even sure if it's offered anymore. But one of the early struggles it faced was that the algorithm by which an applicant was assigned a credit limit was completely AI-driven. Not even the customer service representatives you might call to understand why you were given a certain limit could explain it. The way the algorithm had been productionized was such that there was no explainability offered either to customers or to the people who interfaced with customers. And so people started noticing discrepancies. There's a very popular tech personality who noted that, despite he and his wife having incredibly similar backgrounds, he got four times the credit limit that she did. And so he was wondering, is there sexism at play here, because that's the only variable he could find.
Maddy Want: (10:21) Questions like that about fairness and about explainability and accountability came up. Apple and JP Morgan happened to handle that terribly, because they both respectively declined to take accountability for that decision and said, "Well, we use an algorithm. This is what the algorithm said." And to me, that kind of positioning is going to be the reason why a lot of things fail or go badly. I can't believe that 15 minutes into a podcast I'm saying, “The way to make AI successful,” but I really do strongly believe there are very few situations where we'd be happy to let a machine make a decision fully autonomously on our behalves and guarantee that we would take accountability for that situation. But I do think that's what's needed, and the higher the stakes are on the product, the more critical that is. Something like a credit limit really affects people's lives.
Maddy Want: (11:10) If it's something else like, “Why did you recommend me this pair of shoes instead of that pair of shoes,” much lower-stakes game, maybe no explainability needed. But I'm not seeing that develop much. That's not coming through in the offerings that I'm seeing, especially with generative AI. We're all thrilled about the incredible capabilities, but when push really comes to shove, and especially in a commercial or product context, what kind of customer interaction would you really allow a genAI to drive on your behalf and stand by anything it would say? Close to none. And to me, that's a huge part of why the current enthusiasm, and a lot of the companies directing resources toward genAI today, are not likely to see the fruits of that labor: we may have had a step-function improvement in generation, but we haven't had matching improvements in explainability, in trust development, and in controls. And so it's not gonna be that useful until we get there.
Satyen Sangani: (12:05) Which makes complete sense. Many algorithms basically are on some level not explainable. I mean, neural nets are famously unexplainable. In many cases, a lot of the sophistication comes from a working mechanism that often doesn't quite have a clear, logical, step-by-step plan. So that seems like a fundamental challenge in the space overall. In those scenarios, how would you respond?
Maddy Want: (12:29) I think maybe naively, but you can still say something. You can ask: how was this result produced? Why did I get this number? Or whatever it is. And you can say: we use an algorithm. It takes into account the following factors. It finds relationships between them that are not always explainable, or at least can't be detected by the human eye, and the number it outputs we have assessed to be accurate 85% or more of the time. It may not explain why that particular output was produced, but it does explain a little bit more about the factors that went into it and the expected accuracy, so that people can be a little bit more informed about how it came to be instead of having a result presented to them with zero context.
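Maddy's minimum-disclosure idea can be sketched as a thin wrapper around any scoring model: even when the model itself is a black box, you can return its output together with the factors it considered and its historically measured accuracy. Everything below (the field names, the toy model, and the 85% figure) is illustrative only, not drawn from any real credit system.

```python
from dataclasses import dataclass

@dataclass
class ExplainedScore:
    """A model output packaged with the context Maddy describes:
    the factors the model considered and its measured accuracy."""
    value: float
    factors: list       # names of the inputs the model took into account
    accuracy: float     # historically measured accuracy, e.g. 0.85

def score_with_context(model, applicant, factors, accuracy):
    """Run a (possibly opaque) model, but return its output alongside
    the disclosures that make the result interpretable to a human."""
    inputs = [applicant[f] for f in factors]
    return ExplainedScore(value=model(inputs), factors=factors, accuracy=accuracy)

# Illustrative stand-in for a black-box scoring model.
opaque_model = lambda xs: sum(xs) / len(xs)

result = score_with_context(
    opaque_model,
    {"income": 0.8, "payment_history": 0.6},
    factors=["income", "payment_history"],
    accuracy=0.85,
)
print(f"Score {result.value:.2f}, based on {result.factors}, "
      f"historically accurate {result.accuracy:.0%} of the time")
```

The point is not the model, which stays unexplained, but that the wrapper forces every result to carry the two pieces of context a customer-facing representative could actually relay.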
Satyen Sangani: (13:09) It's funny, I've made the same argument too. I was talking to somebody on the same premise, which is that even if you can't explain the algorithm, you can explain the method for getting to the algorithm, the libraries that you use, and the data that you actually use, to the extent that it doesn't reveal anything private. There is a lot you can do to bring transparency to AI. Maybe switching gears a little bit, I wanna go back to the title of the book, Precisely, because there's a lot of things that you mentioned. There's digitization, there's obviously speed, there's performance, and so there are a lot of attributes of bringing something from an offline or manual or un-automated world to a digital world. But why is precision the key attribute that you chose to focus on, and why does it describe so much of what you're concerned with in the move to a digital world?
Maddy Want: (13:53) I'll give three examples, run through them quickly, and then pull the theme out. The first one is a company, originally a nonprofit, called Zipline. They're headquartered in San Francisco. What they did about 10 years ago was work with the governments of various African countries, like Rwanda and I think Ghana, regionally around that part of Africa, to basically launch a drone-driven urgent medical supplies delivery service. That meant blood, it meant life-saving medicines. What they would do was basically have a pick-and-pack operation where a doctor would call up and say, I need the following delivered to this patient who lives here. Within 10 or 15 minutes, they would load that supply onto a drone and send the drone on its way, and the drone would deliver within about 3 or 4 meters of the measured location. You'd have somebody standing there, ready.
Maddy Want: (14:49) If you think about the comparison to the other delivery options, what would the traditional supply chain infrastructure have looked like? We're talking about locations where there isn't always a road, there isn't always a street address. Certainly, the infrastructure required to carry something like blood, which needs to be maintained at a certain temperature, carries a huge amount of supply chain risk, let alone the delivery time. It's unlikely to have been within a couple of hours. This precision, having control of exactly where it's delivered, exactly when it's delivered, making sure that it's the exact amount and type of supply that was needed, is just so vastly different from what the alternative would have been that you can't even measure it as an evolution of the original. It's a completely new beast.
Maddy Want: (15:34) The second example is a company called ZestFinance. I think it's a popular MBA case study, so you may have heard of it, but the basic idea is that this company figured out a way of lending to people who had basically un-lendable credit scores, very bad credit scores, very bad credit histories. They did that by being more precise about who to offer credit to, how much to offer, and why to offer it. A great example: a traditional credit score calculation will take into account how many payments somebody has missed on prior loans. Zest went one level deeper and calculated how many people called up the bank in advance of a missed payment to alert the bank that they would be missing it, versus just letting the payment date come and go without any acknowledgement.
Maddy Want: (16:25) That was one type of signal that they would pull out to try and really tease out, okay, well from this group of people who have terrible credit, there are more diligent and less diligent people in there. There are people that we can lend to and we just have to find them. The challenge was about using data more cleverly to break up a big homogenous group of people who were effectively unbanked and find a way to offer them credit by understanding more about them and being more specific about what they could offer.
Maddy Want: (16:52) One of my favorites, Uber, everybody knows Uber. It's a great example. When I was a kid, the way that you would order a taxi was that you would pick up the landline and you would call a taxi company and you would say, “I live here and I need to go there and can you send someone,” and a taxi may or may not show up at your house. There's no timeframe guarantee of when they would show up. You have no control over the price. That's it. And best of luck. To think that within my lifetime, now with the supercomputer in my hand, I can not only summon a car directly to my door, I can be price competitive with it. I can avoid surge pricing, I can look at different competitors, I can see exactly where the car is down to the meter at any given moment. I have just an incredible amount more of control, and that is delivered through, again, just a foundational shift in the precision that the technology is offering me.
Maddy Want: (17:43) Instead of knowing nothing about who would come to get me and when, knowing nothing about the expected price, and basically having no options, we've gone to total visibility on location, total visibility on price, and the ability to shop across competitors. To me, the big theme out of all of those things is that it's not about the technology itself. It's not about drones, and it's not about auction mechanics like those that power Uber. Those things are cool, but it's about the capability that's given to the customers or the patients or whoever. The theme there is that they have more precision. They can be more precise about what kind of change they're requesting or effecting, and they can have an outcome that's much more tailored to them, instead of interacting with some sort of block system that only does block functions in a block way.
Satyen Sangani: (18:29) So it's sort of a continuation of the Starbucks/Netflix phenomenon: I can get exactly what I want, exactly when I want it, because the digital medium provides me with the ability to customize products and services effectively on demand. Your first example reminded me of this technology, which I had not heard of, called what3words. And of course it was European friends who shared it with me, but it was great because with three words you can set a meeting spot, accurate literally to within four feet, where you otherwise might not have one.
Maddy Want: (18:45) Incredible.
Satyen Sangani: (19:00) Yeah. Which struck me as a really interesting example of this, but you're right, there are so many others. So as we think about this idea of precision delivery of services and products, there is a flip side to it, which is: do I need all this choice? Does it really matter if my latte has a lot more foam on it or doesn't? Am I that much happier? As you think about public policy, there are certainly some questions around that. Is precision always better in your mind?
Maddy Want: (19:24) No, definitely not. I would go so far as to say probably yes on balance, but certainly not a blanket yes. The same systems that deliver us videos we're interested in on TikTok also deliver the most highly addictive content algorithm we've ever seen. That's having lots of social impacts on young consumers that are measurable today. Being so effective at targeting what somebody is specifically interested in is great on one hand and dangerous on the other. I'm choosing examples that are uplifting success stories, but not all of them are. There are some great examples in the book of attempts to take something analog to the forefront of precision that either fail for interesting reasons or aren't better off at the end of it.
Satyen Sangani: (20:18) Yeah. You started writing the book while you were in the master's program or did you start writing it... You did. And you did so with a professor at Columbia, is that right?
Maddy Want: (20:27) Right. Zach's a professor at Columbia. He's the former deputy commissioner of the New York Police Department. He has fantastic thoughts and decades of experience across a few different industries. The combination of his persona with mine, the young tech native, was a great mix.
Satyen Sangani: (20:44) Yeah, it sounds like it. And at least I find that writing helps me understand things. So in this case, you were working on the book for four years. What did you learn during the process of writing it that was unexpected?
Maddy Want: (21:00) So, so many things. That's tough. Definitely the best part of this was the interviewing. I think two of the four years was spent interviewing people, probably close to 100 in all. These were wide-ranging conversations where we took notes and then started to unpack them afterwards. That took another year. Just being exposed to so many brilliant people from so many different industries over enough time — and like you said, it was a few years — it gave me a sense of sameness, a sense of sameness about the direction of the work that these people are doing or trying to do. It gave me a sense of sameness about the way that human brilliance presents itself, no matter what the endeavor is. It definitely gave me a strong sense of sameness about technology and the way it emerges, the way it gets adopted, the problems it faces, and then its eventual decline or succession or whatever it is.
Maddy Want: (21:52) And it was fascinating to come to that kind of meta conclusion, given that we were talking to people who write algorithms to classify the genre of a song, and the next day we would go and interview people who attach computers to John Deere tractors for precise distribution of weed killer or fertilizer, depending on whether the computer vision recognized the plant in front of it to be a weed or a crop. These were the different conversations day to day. A lot of people in radiology and medicine, et cetera. Somehow, over time, we kept seeing that the problems had so much in common, way more than we ever would have predicted going in. We wouldn't have predicted anything going in. We had no idea what we were getting into, but it was just fascinating to write it out.
Satyen Sangani: (22:37) So this sense of sameness is the patterns that you saw across the various examples in precision and how these algorithms are developed and how they work and how people approach the information. As you know, this is a podcast about data culture, and we think a lot about this idea of how you get people to use data. Were there any lessons coming out of the work around people using data more, being less intimidated by data, being more scared of data through the process of their working with these systems? And how did you think about the social implication and the social change of technology being introduced more to people's work lives and actual personal lives?
Maddy Want: (23:12) There's a saying that I think is common in tech, which is “The tech is the easy part; the people are the difficult part,” often used in reference to people and org complexities. It translates nicely here in the sense that if one person or one team is out there in a bubble trying to achieve some sort of cultural change in how the society or the organization they're in uses data, the likelihood of success in an environment like that is very, very low. That's going to be a very difficult journey, because if a company or an org has been operating in a certain way, and that way has been driven by instinct or social relationships or top-down direction, proposing a change not only requires people to let go of the power structures they've been accustomed to, but also requires them to wrap their heads around a technology which they may never have met before… that's huge.
Maddy Want: (24:06) And it's just unlikely to succeed. We go into a lot of depth about, if you are such a person or such a team, how do you assess the landscape around you to understand what kind of situation you're in and adapt your tactics accordingly. An example might be the New York Times case: there's a long, robust culture of the editorial side, the newsroom. There are plenty of movies made about brilliant journalists running around holding papers and begging for the front page tomorrow, and that kind of thing. And that is real. Then, transitioning to a tech company, you're hiring increasingly technical people and saying to those people, “You have to figure out how to make a digital homepage make way more money than it's currently making.” Or, “You need to figure out how to transfer our business model from being impressions- or eyeballs-driven to being subscription-driven.”
Maddy Want: (25:00) And maybe the conclusion you came to, which is certainly what the teams we spoke to said, was, “Okay, well, we're going to have to experiment with different homepages shown to different people.” Just introducing the idea that there isn't just one homepage. We could get to a place where there might be two homepages. There might be 100 homepages. There might be n homepages, where n is the number of customers. That idea of fragmenting the customer experience was very tough, very, very tough to get across. The editorial psychology had been, “It's our job to tell people what they need to know when they don't know it. That's why we curate the news. We choose what sits at the top, and that's never going to be handed over to an algorithm.” There was this push and pull between the tech directive, which was coming from the CEO, and the editorial culture. The tech teams did have a lot of mandate and authority that had been granted to them, luckily, but that didn't make it easier.
Maddy Want: (25:55) They still had to negotiate with the editorial teams for a long, long time to say, “Just give us a tile at the bottom of the homepage and let us play with what we show in there to see if we can get clicks to change, or whatever.” Over time, that became an experimentation culture, a data culture, but it did take years, and the team driving it had to earn every single win and every brick that went into the house that is now a multi-billion-dollar subscription business. They had to build it and show the value and rinse and repeat. They got zero credit. They got zero advance trust. They really had to carve it out from a very entrenched culture. That's not to say it's a good or bad thing; it's not a value statement. It's just to say that achieving change is really, really hard, and achieving change where you're proposing that people rely more on data when they don't really even know what that means: very tough.
Satyen Sangani: (26:47) In that case, journalism has this sense of being isolated from the consumption of it. In some sense, it's the first draft of history. You're supposed to tell the stories that people need to know. You as the journalist need to discover those stories. Sometimes those stories might not be popular or very interesting. As a journalist, a lot of what you're trained to do is very different from capturing people's eyeballs, because on some level, the stories should be independent of whether people are interested in them. And maybe even counterintuitively, the ones that need to be told are the ones people are least interested in. There's almost a value mismatch there. Did the values of the organization change in that case? Or how do you navigate that? Because that feels like the hardest piece in that scenario, but maybe the most instructive, because probably a lot of data transformations demand it.
Maddy Want: (27:31) One of the companies I worked at, the one called upday that you briefly mentioned, is a news app that is preinstalled on Samsung phones — the S line, specifically the S7 and S8 — in most of the EU. It's a German-based publisher called Axel Springer that I was working for. That's the way the product was distributed: it was pre-installed onto Samsung phones. That's what the partnership was. We faced this exact same problem close to five years ago. Now we're talking about the New York Times, obviously a much bigger beast, but with very similar problems. How do you build and deliver a product that meets so many often contrasting requirements? It's got to deliver the important news; that's probably its core purpose.
Maddy Want: (28:11) It's also got to be engaging because there's no point delivering a product that nobody would visit twice. I mean, it's also got to make money. How do you marry those three things? In the end, that surfaced in a two-part product strategy. Specifically, what we ended up with was the page that you would land on is the top news. On the bottom navigation, there was top news or home or whatever it was called. And then the other tab was “for you.” And this concept of “for you” is very well understood by consumers across a range of different products. Now, it's understood to be recommendations. It's understood to be different every time that you look at it. It's understood to change more frequently than the singular editorial experience is expected to change.
Maddy Want: (28:52) And it's also where people's eyeballs often go by choice, because it is more relevant to them. But having people land on the important news first kept the top of the funnel, the highest-awareness real estate, on the core point of the product. Most of the revenue was generated through the “for you” feed, where we also did the integration of the advertising. The part that I was specifically working on was the integration of advertising into the “for you” flow, and targeting that advertising to take into account what interests the customer had displayed in their navigation. The same inputs that were powering the personalization engine ended up powering the ad targeting as well. The goal there was to make the ads as relevant as possible, right? That's a win-win for everybody, because people are more likely to click on them, advertisers make more money, et cetera. That's the way we worked on it in that company, on that product. What The New York Times landed on is a much bigger, much more complex evolution of that same idea.
Satyen Sangani: (29:50) Yeah. I guess the optimistic telling of that scenario is that you want to meet people where they are. If you're telling a story, then you have to meet people where they're willing to hear it. If you can do that better, then your stories and your products will ultimately have more impact. Switching gears a little bit: that brings us to what you do today. You're at Fanatics and you're the VP of data for betting and gaming. Tell us about Fanatics. Tell us what that job entails, what you do there, and what you're focused on.
Maddy Want: (30:19) Fanatics is a well-established American company. For those who are listening and haven't interacted with Fanatics before, it sells sports merchandise, effectively. If you go to Fanatics.com, it will have all the jerseys, all the memorabilia of your favorite team, or of any team, really. It's an incredibly broad enterprise, and the supply chain behind it has been well-studied. Fanatics has done a fantastic job historically of building out such a highly controlled supply chain function that it's able to manufacture a lot of the product that it sells. It's taken an interesting approach in a couple of respects. But the point of sharing that backstory is to say this is a retail company at heart, a physical retail company, and it's also become an e-commerce retail company. It's been operating that way for the better part of 10 years. What the leadership at Fanatics decided to do, probably four or five years ago at this point, was to take the Fanatics brand, which is a very well-known household brand in the States, and venture into new businesses with that brand.
Maddy Want: (31:23) The betting and gaming business is one of them. That's the one that I work on. What was the data culture when I arrived? Well, there wasn't any at FBG, Fanatics Betting & Gaming, because it didn't exist yet. The data culture at Fanatics.com, I'll say, was very interesting, because it had developed as the business evolved from retail to a hybrid retail and e-commerce structure. They had incredible knowledge of and visibility into the entire supply chain, manufacturing, merchandising, and sales process. Where we were coming from at Fanatics Betting & Gaming, the business model that a betting and gaming site or experience works from is engagement-based. You want people enjoying the app. You want them coming back, spending time, and interacting, even if there's no currency in the interaction, even if it's just a browse. That's a good thing.
Maddy Want: (32:17) That's not necessarily the same experience that an e-commerce website is aiming for, where it could be a very directed search-to-conversion type of goal. We were very interested from the get-go in understanding the value of customers, no matter what their behavior was on the product. We took a customer-first, as opposed to a supply-chain-operations-first, approach to data. What that has become over the course of about 18 months, with beautiful collaboration — and having the right data partners in place has been a huge part of that, too — is a view of the customer that now spans across a couple of different businesses. That's really what I'm trying to achieve overall. That's the point of me being here. So, yes, sure, I want the data operation for FBG to be the best of all the sports gambling companies in the world.
Maddy Want: (33:11) That's like a table-stakes outcome. It has to be. It's on track to be, if you ask me. (No brag.) But really, if we want to make something very special out of this in the long term, it's much more about the Fanatics ecosystem and having awareness of and the trust from the customer across a variety of different businesses, making it feel like the most valuable place to be and spend their time and spend their money. That has been an incredible challenge and experience, sort of jointly launching a new company and having all of the data challenges that come from building something very quickly from scratch, simultaneous to operating a well-established existing company and figuring out how they work together, sort of all at the same time.
Satyen Sangani: (33:51) The claim that the operation that you're running is one of the most sophisticated in betting and gaming is a big one, because these are businesses that are basically all fundamentally data businesses. Tell us what you have done and tell us about your journey specifically. So what did you do as soon as you got there? We talked a little bit about technology, people, process. But what were the technologies, people and processes that you focused on first? How did you think about that work and what progress have you made and what progress is left yet to be made?
Maddy Want: (34:17) First things first — and then next things next; they were very fast follows in my approach. Yes, this is a regulated business. This is a business with very strict data requirements from a range of different regulators, financial auditors, et cetera. The baseline that we had to have in the quality of the platform, the quality of the data, governance practices, et cetera, already had to start way up top. There was no maturation period that would have been acceptable, where we could have said, oh, we're not quite there yet. The first people that I hired were engineers, data governance analysts, and people who could build out the reporting and visibility that we give to regulators and financial consumers, I guess. But it would not have been enough, or it would not have been appropriate, to say, “That's what we're going to spend the first six months doing, and we'll get to other things after that.”
Maddy Want: (35:07) It really all had to happen in parallel. We launched, I would say, two other major categories of investment. The first major category was in the sports data itself. That comes in the form of the models that we use to predict the prices, or the odds, that will be offered for every game, every event, and every selection on each event, ranging from the NFL game that plays tomorrow all the way through to a futures price for the Super Bowl, across 10 different leagues, across dozens of active states, et cetera. There's a massive proliferation in the usage of prediction as a basic foundation of the business. You really see that in any company that operates as a marketplace, whether it's a one-sided or a two-sided marketplace — one-sided in the case of betting and gambling, where the book is the platform but also the seller.
Maddy Want: (35:55) You see stock prices are calculated the same way. It's based on demand. So are odds, so are prices. It's based on the likelihood that a certain event will happen. It's also pushed up or down by the volume of people that believe that and place a bet accordingly. There's a lot more to sports data than just odds and line generation. There's a huge amount of intelligence to be squeezed out of more deeply understanding the sports, the players, the relationships between the players and the teams, the past performance of players, league relationships, et cetera. It's a very complex world.
Maddy Want: (36:27) There's a lot we can do there specifically to support our trading operation. Every market that we offer is being operated by real human traders. We have a massive trading team and they need all sorts of insights in order to operate to the best of their capability. We put a lot of investment into supporting that. The second major category is everything related to customer and product understanding. That is a huge pillar of people who work on everything ranging from personalization, recommendation algorithms through to lifetime value and predictions of various other kinds that are used by product teams, marketing teams, executive visibility, anybody that you can imagine. That's also where a lot of our partnership with the rest of the Fanatics businesses comes from. Between sort of those three pillars and that being sort of the sports insights, the customer insights, and then the table stakes, regulatory governance, controlled reporting environments, that's been the portfolio approach to date. It's been very, very fast growth, both in the size of the team and the size of the operation. We've barely even started in the scheme of the journey, so I know it's about to get crazy.
Satyen Sangani: (37:33) And how did you balance investment across the three? Every data leader faces this challenge: how much do I invest in the platform and the infrastructure to make my day-to-day go somewhat more efficiently and easily, versus investing in the use cases and insights that are actually going to deliver business value? How did you navigate that path?
Maddy Want: (37:52) There are two tenets that I held on to at the beginning, and it is a hugely privileged, once-in-a-lifetime situation... I don't even know how applicable this advice is gonna be to any listener, because how often do you get to walk into a brand new company and hire the entire team from scratch? I may never get to do this again. It's very different from inheriting an existing team or an existing operation. But because I did have a blank slate, I focused on two things. The first was prioritizing enabling functions, functions that would let us support other things more quickly, over specialist functions.
Maddy Want: (38:26) I started off by hiring data engineers and data governance folks, some of the top priority because I knew that the basis for everything that we wanna do is gonna be the quality of the platform and the quality of the data, and there's just no point hiring anybody else, if that stuff is not right. The second thing, which was a bit more of a risk, I think, was hiring very senior individual contributors first and delaying, quite a long time, the hiring of more mid-level and more junior team members and saying, I would rather have 10 people total on the team who have done this before and who can walk together at light speed and get a platform set up that I know what to expect from the quality of that platform, I'd be happy to pay more for those people and do whatever it takes to get them on board ASAP. We'll figure out the rest of the team later. That's kind of what we did, it paid off, but if I hadn't had the resources of Fanatics at my disposal to do that, it could have been a very different story, and I wouldn't necessarily recommend it to everybody.
Satyen Sangani: (39:27) How is the team structured? It sounds like you have resources that directly report to you. Does that mean it's a primarily centralized team, or are there also resources in the business units that are consuming what you do?
Maddy Want: (39:35) It's not fully centralized, but the center of gravity is definitely in the data work, which is the work that I lead. I've got a handful of direct reports who correspond roughly to those three pillars that I mentioned before. In addition to that, I've got the engineering function (that's the data engineering function, I should be precise), which is a horizontal function that supports all of these insights. I also have a couple of principal, very senior individual contributors who basically do anything and everything, and I rely on them very heavily to make good judgment calls every day, even when I'm not in the room. They'll help me assess new opportunities, they'll deliver specific pieces of work when needed; they're very much trusted advisory kinds of roles. With that team, especially working together for the better part of a year and a half now, we're really hitting a stride, which is pretty awesome.
Satyen Sangani: (40:31) You mentioned this idea of being privileged in starting with a new team, of being able to go from ground zero. It's funny, because there are a lot of leaders who obviously do inherit teams, some of which are quite large, but I actually think the same thinking is pretty helpful: think about what you would do if it were day one, in the Jeff Bezos framing, as it were. A lot of the time, you have more latitude than you think, and you ideally wanna break from the past faster than you think in order to achieve whatever the goals happen to be as a data leader. So even if that's not the exact scenario, I do think that's a lot of how people should inform their thinking. You mentioned data governance, and you mentioned that you hired two people. What was their day-one work, and how did you stand that up? Because that's always a challenge for certainly every one of our customers and everybody that I talk to: “What do I do with data governance? What is data governance? Where do I start?” Where did you start?
Maddy Want: (41:20) I started on the bread and butter. There was a lot of process creation and documentation related to: what are gonna be the rules for how we share data among the company; who we give access to, to what, and why; what are gonna be the rules about data classification standards (we need names that refer to specific things and correspond to different severity and treatment levels); and how we keep track of everything. Our data inventory was zero for a while there, so we had the opportunity to really have crystal-clear awareness of what data we have, where it comes from, everything. Coming from that zero point, compared to maybe inheriting a data platform with tens of thousands of tables that none of the current employees have ever touched and being faced with an audit and clean-up mission for an old data platform, is a totally different type of challenge. So for us, investing in a platform and platform partners who would help us keep track of everything we have in a scalable way from day one was how the team spent most of its time.
Satyen Sangani: (42:25) Which is different for most people. You don't necessarily get to create your full end-to-end data stack at day zero without any legacy or history, so that is certainly super different. Every data leader is always in this place where you're trying to get more budget and trying to prove value. How do you know you're being successful? How do you align with success in terms of next year's budget, getting people to give you more resources, and proving success?
Maddy Want: (42:49) That's a tough one. I've witnessed data leaders who primarily measure success on stakeholder feedback, and they view their organizations as service organizations in the sense of, “We're here to provide business insights. If we're doing that, we should be hearing positive feedback from business leaders, and if that feedback cycle is working well, then what other measurement really matters?” I think that's a component of it, of course. It's a huge component, but I think there's also a broader responsibility for data leaders, at least in big tech companies like the ones that we're mostly familiar with, to have a strategy for turning the company's data into an asset, and not just a pile from which insights can be dug out. That can take a few different forms.
Maddy Want: (43:39) It can take the form of, we're gonna have a strategy to share or package up or derive insights from that can be sold (the data) or we're going to build customer-facing product features that are powered by this data that are going to improve some revenue stream in the product, or we're going to sort of use it as the attraction in a prospective partnership with another company that has other data that we're interested in, et cetera, et cetera, et ceter. I think the leadership required to say over the next three to five years, here is what we're gonna turn our data into, that's the missing component, and it takes a long time to prove out, whether you've been successful at that or not, which is why I said it was difficult. Eighteen months into a strategy like that, maybe there's a lot to show for it, maybe there's less to show for it, but it's a long game, it's not a short game. Combining that with feedback from the organizations or the teams internally that you partner with, or that you serve and keeping a pulse on how well you're doing on that front is also important. It's just not a complete source of feedback.
Satyen Sangani: (44:42) Yeah, I agree with the point that it's historically been subjective. Given where we stand in the ecosystem, one of the things that I'm super passionate about is this idea of assigning value to data initiatives. Outputs aren't value, because on some level, activity isn't value. You actually have to show a demonstrable impact, and even if you can't always measure the linear, straight-through impact, it's definitely the case that you can start to quantify these initiatives more. Looking forward: you've written a book, and you're now obviously in a fairly privileged position with a job that it sounds like you really enjoy and have a lot of impact in. What is top of mind for you as you and every other leader are faced with GenAI, more data, and more expectations? What are you excited about? What are you struggling with? Where do you see the next three years of this five-year journey you're gonna be going on?
Maddy Want: (45:33) Launching a “startup,” which I put in quotes because we're not bootstrapping for funding even while we face many of the startup challenges, but we are a startup in the sense that the product didn't exist and now it does, and now we have to grow it. Scale is the name of the game for the foreseeable future. We're gonna get out there into every possible market, we're gonna diversify the product portfolio, et cetera. For everybody, that means putting to the test a lot of assumptions that we've made, a lot of bets that we've taken — no pun intended. For example, how is this team gonna scale? I've planned for it to be able to scale, but will it?
Maddy Want: (46:07) Going through that scaling is sort of a day-by-day, week-by-week test, and I'm looking forward to that. It's also what's most nerve wracking for me is just thinking about the rocketship pace of growth, and I really want to make sure that elements of the expectations that we have for people in the handling of data, in the sort of technical excellence that we expect from the team are preserved as well as a lot of the team culture that we have, the company culture that we have around data, I think is fantastic, and I really wanna see it hold in place as we go through the next couple of years.
Maddy Want: (46:37) That's one thing. I'm also very conscious of the team and making sure that everybody who comes on board has a career path that they are excited about and passionate about, and that we don't let pockets of the team sort of start to lag or not have any clear purpose, or folks start to feel disengaged or anything like that, and that also gets more difficult with more going on and with a bigger team size, so I spend a lot of time thinking about both of those things.
Satyen Sangani: (47:04) Super helpful. It sounds excellent. I'm sure a lot of people are really loving and enjoying working for you. Maddy, it's been a really fun conversation. Thank you for taking the time and thank you for joining.
Maddy Want: (47:13) Thank you for having me.
Satyen Sangani: (47:19) Maddy makes one thing clear: precision is the hallmark of technology. Precision matters. Her examples, from medical deliveries by drone in Africa to lending to unbanked people and even Uber, tell us why precision is so vital. Starbucks and Netflix have made personalization mainstream. Customers can get exactly what they want, when they want it. But even as an outcome of tech, precision can also do harm. Take TikTok as an example. The app uses an addictive algorithm to feed tailored content to its users. Many of them are very young, and that addiction impacts their mental and emotional health. Precision is powerful. It can make technology better for customers. It can also hook them. What does this mean for you as a data radical? Give people precision, but also give your customers the tools they need to know whether they're creating value from that precision. Thank you for listening to this episode. And thank you, Maddy, for joining. I'm your host, Satyen Sangani, CEO of Alation. Data radicals: stay the course, keep learning and sharing. Until next time.
Producer 2: (48:16) This podcast is brought to you by Alation. Does data governance get a bad rap at your business? Today, Alation customers wield governance to drive business value, increase efficiencies and reduce risk. Learn how to reposition data governance as a business enabler, not a roadblock. Download the white paper, Data Governance Is Valuable: Moving to an Offensive Strategy at alation.com/offense.