Episode 164 - Anne Griffin on Attracting Employers and AI Product

Today’s guest is Anne Griffin, a leader in product management, a startup advisor, and a subject matter expert in AI, blockchain, tech ethics, and product inclusivity.

About Anne Griffin


Anne is a leader in product management, a startup advisor, and a subject matter expert in AI, blockchain, tech ethics, and product inclusivity. She is the founder and Chief Product Officer at Griffin Product & Growth, and the Chief Product & Curriculum Officer of Tech 2025, an emerging technology community and platform that teaches professionals to prepare for the future of work. Her workshop Human First, Product Second teaches organizations and professionals how to think about building more inclusive and ethical tech products. She has lectured at prestigious universities across North America such as Columbia University, the University of Montreal, and Morgan State University, spoken at major events such as SXSW, and created courses for O’Reilly Media. She has built her career in tech over the last decade working with organizations such as Priceline, Microsoft, Comcast, Mercedes-Benz, and ConsenSys, the premier blockchain software technology company on the Ethereum blockchain. Anne continues to work with and research the practical human aspects of technology and building products with emerging and disruptive technologies. Outside of her work, Anne is a voracious learner, frequent traveler (when we’re not in a pandemic), and seriously committed to her self-care and workouts.

Anne’s Website: annetgriffin.com
Attract Your Dream Job Coaching Program: attractdreamjob.com

Links

tech2025.com
Agricultural blockchain: Agriledger.io
Convos about NFTs: Tonya Evans
Eva Penzey Moog (studies how tech is used in domestic abuse)

Related Episodes

Episode 160 on NFTs
Episode 157 on the financial Tsunami - the vast changes our economy is experiencing
Episode 145 on Tech Ethics Chaos at Google
Episode 5 on Ethereum and Smart Contracts

Transcript

Max Sklar: You're listening to The Local Maximum Episode 164.

Time to expand your perspective. Welcome to The Local Maximum. Now here's your host, Max Sklar.

Max Sklar: Welcome everyone, you have reached another Local Maximum. Excited for today's guest, which is Anne Griffin, we're going to talk about career and product management and AI and blockchain. And so it's gonna be a really fun discussion. And yeah, so without further ado, my next guest is a leader in product management, a start-up advisor, and a subject matter expert in AI blockchain, tech ethics, and product inclusivity. Anne Griffin, you have reached The Local Maximum. Welcome to the show.

Anne Griffin: Thank you. I'm really thrilled to be here today. And you know, I love your podcast. I love the subjects that you talk about on your podcast. I especially love the recent episode that Charlie recommended to me about kind of our current and or future kind of financial crash. And yeah, I think you talk about a lot of really interesting things in tech and economics and innovation. And so I'm really happy to be here today.

Max: Oh, I appreciate that. I didn't even know that you listened to some of the podcast. You know, sometimes I get a little bit into territory where I know that some of my guests wouldn't agree with me, and I wonder, would people still come on my show? But then people do, so it's okay. I really appreciate that. And I know you're coming on because we spoke recently on Charlie Oliver's podcast. Is it still called Tech 2025? I think she might have changed the name of it.

Anne: It’s called Fast Forward. It’s Fast Forward. So it’s a Tech 2025 podcast, but the name is actually Fast Forward. 

Max: Oh, gotcha.

Anne: So if you haven’t yet, please go listen to that podcast.

Max: Yeah. Is that out yet? The conversation we had about Clubhouse.

Anne: The conversation we had about Clubhouse is not out yet. But there are other really great episodes. I’ll write them.

Max: Okay. It will almost certainly be up by the time this episode goes out. So yeah, Clubhouse is really big. We had a really good discussion about what's going on in the world of audio and the world of start-ups. And, you know, that's the kind of discussion I'm having here. I looked at your bio, and I was like, whoa, Anne is really smart, and she talks about all the things I talk about on the show, so let's have her on.

Anne: Yes, I'm so excited. Mostly I was like, I'm gonna get to nerd out. This is great.

Max: Yeah. So, um, what is it that you do exactly? My day job right now is, I'm just a software engineer, basically, I'm writing code. Well, I have kind of a special role at the company. But what would you say you do on your day to day?

Anne: Yeah, I'm a product manager in my nine to five; in my five to nine, I do a lot of other things. But I've been in technology for over a decade. I studied industrial and operations engineering, but I still had to learn C++ as part of my engineering education. So it really lent itself to being a product manager: having the technical background, being able to understand what the engineers go through and some of the concepts they're talking about. Also, one of the things about industrial and operations engineering is that you focus a lot on optimizing things, and it's not completely focused on manufacturing. When you consider that product managers have to have some project management skills, you start thinking about how you make things more efficient and run smoother, and some of that comes in. So it makes it a lot easier to focus on the actual product management when you're easily able to organize things in a way that they just kind of run themselves. But yeah, so...

Max: Yeah, organization is key sometimes.

Anne: Yeah. That's...

Max: We have a lot of trouble with that on my team sometimes.

Anne: Yeah. So that's my 9-5.

Max: So what is it that you're doing with, I know you said you're doing something in terms of helping people, not necessarily with their resumes exactly, but to attract the right kinds of companies and jobs. And I have a long history of very annoying interactions with recruiters. So I'd like to know, because you might think, oh, it's great, a lot of recruiters are after you, but I rarely get something good out of it. What are you doing there? And what would you say is the problem?

Anne: Yeah, so I run this course called Attract Your Dream Job, and it is focused on people of color. But really, it's called Attract Your Dream Job more so because I have gotten pretty good at getting recruiters to come to me with very relevant opportunities. And, I didn't go too deep into it in my intro, but one of the things I do work with is machine learning and artificial intelligence, and most recently I even taught a pre-college course at Columbia University last summer about applications of AI, as part of an AI and blockchain course.

And the thing is, about five, six years ago, I was in tech, I was doing product management, and with AI and machine learning I was like, I really think these are gonna be a lot more prevalent in everything that we're seeing. I could see how far along Google and Facebook were with that and saw how far behind everyone else was. And that's actually how I found Tech 2025, because Charlie was one of the few people talking about this in terms that people could actually understand. So I started learning more about this. And I updated my LinkedIn to talk about my interests, not lying about my experience but saying, these are my interests, I'm really interested in AI and machine learning, here are the books I'm reading about it, here are the courses I want to take. I didn't lie about any of my experience, but I was at the very beginning of my journey into emerging technology. And I had this recruiter from eBay reach out to me for a product manager role in machine learning. And I said, that's literally exactly what I want to do. So I said, yeah, let's have a conversation. And one of the things I asked this recruiter, because I was like, this is too perfect, how did this person find me? I said, so how did you find me? He said, I typed in ‘product manager, machine learning, New York City,’ and you were the second person to come up. And I was like, wow, okay, so I'm clearly doing something right.

And in between, I've worked with great career coaches, I have really great mentors, and I've also done some experiments on my own time. And I've gotten really good at attracting opportunities that are really great fits for me. I haven't actually been looking for a job during this pandemic. But just during the pandemic alone, some of the names you would recognize that have reached out to me for roles I would actually be interested in: Twitter's reached out, Facebook's reached out several times, Amazon's reached out, DoorDash has reached out; the list goes on and on: Vimeo, Dropbox, several others. And they're all roles where I was like, this sounds really cool, I definitely want to talk to you. Now, I'm not necessarily in a position where I'm saying, yes, I'm definitely going to move right now. But it's something where I was like, oh, this is great. Because even though I'm not necessarily going to take on these roles, I'm interested in hearing more about what this is, or starting a conversation with some of these companies, because, you know, things can change.

So again, if anyone from my job is listening, I'm not going anywhere, but just be aware that I am a hot commodity. This is something I teach to others, because what I noticed is that when we talk about job searches, there's a lot of talk about, oh, do your resume like this, or, here's your LinkedIn review. But there's not really talk about the strategy of why those things work. We do talk about some of the outbound strategies, but this is one of the few things I've seen, period, that talks about your inbound strategies. If you have basically a job-search SEO, or inbound traffic strategy, it saves you a lot of time and stress, especially if you don't have to actually look for your next job. Your next job comes to you.

Max: Yeah, so I feel like you saw these jobs coming in and you were like, this is great, this is what I want. I think a lot of people have problems just taking a step back and asking themselves, what do I want? You ask a lot of people, well, what do you want to do? And they're like, I don't know, I don't know what there is. So is that something you can help people with as well?

Anne: Yeah, that's actually the first part of the course. And we call it finding your North, your North Star, right? Because especially when we talk... 

Max: That's good, because I did not know that when I asked the question. 

Anne: If you actually set that up...

Max: I don't do that.

Anne: You set that up perfectly. Because my thing is, nothing I tell anybody is going to matter. It's not going to be helpful if you don't have a North Star. If you're trying to be a product manager and a software developer and a UX designer, the recruiters won't understand what you do; they spend about three seconds looking at your LinkedIn profile and then move on to somebody they can actually understand: “what the heck does that person do,” right? So that's actually the very first thing we teach in the course, because we get people with these muddled things. And LinkedIn doesn't just look and see, “Oh, that person has this one keyword.” LinkedIn is actually also giving recruiters a score of how relevant you are based on the skills you have. So it's not just, “Oh, they had one keyword, we're gonna throw it in there.” LinkedIn is also looking at how many of the keywords and the things this person posts about actually overlap.

And so that's why you need to really be able to focus on something; you need to understand your North Star. Because otherwise, you can put all the keywords and everything else in your LinkedIn and your resume, and do all this networking, but if you don't have a North Star, no one's gonna really understand what the heck you do. And no one's shopping around for employees when they don't really understand what those employees can do. You have to be able to sell yourself and tell a story. And oftentimes, that story is being told about you before you actually even get to have a conversation. Your LinkedIn is the beginning of that story.
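The keyword-overlap idea Anne describes can be sketched in a few lines. LinkedIn's real relevance scoring is proprietary, so this toy ranker, with made-up profiles and a made-up scoring rule, only illustrates why a focused profile tends to outrank a muddled one for a given search:

```python
# Toy sketch of recruiter-search relevance: score each profile by the
# fraction of the recruiter's query terms found in its skills or posts.
# Profiles, field names, and the scoring rule are all hypothetical.

def relevance_score(query_terms, profile):
    """Fraction of query terms that appear in a profile's skills or posts."""
    text = {skill.lower() for skill in profile["skills"]}
    for post in profile["posts"]:
        text.update(post.lower().split())
    hits = sum(1 for term in query_terms if term.lower() in text)
    return hits / len(query_terms)

profiles = [
    {"name": "A", "skills": ["product manager", "machine learning"],
     "posts": ["reading about machine learning this week"]},
    {"name": "B", "skills": ["ux designer", "product manager", "developer"],
     "posts": ["portfolio update"]},
]

query = ["machine learning", "product manager"]
ranked = sorted(profiles, key=lambda p: relevance_score(query, p), reverse=True)
print([p["name"] for p in ranked])  # the focused profile "A" ranks first
```

The point of the sketch is the North Star effect: profile A, focused on two related skills, matches every query term, while the muddled profile B only matches half of them and sinks in the ranking.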

Max: What do you tell people who are constantly contacted by recruiters who want them to do the same job they're doing now, somewhere else? When they're basically like, well, I want a new job because I want to do something different than what I'm doing now.

Anne: Are we talking more about seniority, or more about, this person is doing a career pivot?

Max: Yeah, so basically, if you want to do a career pivot, recruiters will try to push you into doing what you've already done, because it's probably an easier sell for them to the companies.

Anne: Yeah, so two things. What I always tell all my students is to always write and talk about yourself in the context of where you're going. Your whole LinkedIn should be written in that context. How you talk about yourself in an interview should be in that context. It should not just be a history textbook; I think there are some great books written about history, but there's a lack of context when it just feels like a chronological order of this, then this, then this. So if your LinkedIn tells the story such that, yeah, they see that experience, but it's written in a way that they can kind of see where that person's going, then they can see, maybe this isn't a fit, or if they reach out, they'll at least kind of understand the other–

Second, and this is where people in tech, at least me, I never learned sales skills; I only recently acquired sales skills. One of the things is, you know, if somebody reached out to you, great, so now you have an in. This is where you say, you know, I'm not interested in that, but if you have roles like this, then I would be happy to talk to you. Because if they do have a role that's specific to that, now you actually have an in, knowing that, oh, this role just opened up. It's not posted yet. Or maybe it's posted and you didn't see it. Or maybe you didn't know about this company, so you never even saw what was posted there, right? If you don't ask, you don't get. You just turn it around on them and ask them for what you actually want, since they're already in your inbox and taking up your time.

Max: Yeah, so how does this course work? Is it an online course in the evenings, or something like that? How does one sign up for it?

Anne: Yeah, so it's a four-week live small-group coaching program. It's once a week on Wednesdays at 7pm Eastern. Each class is an hour and a half to two hours, which might sound like a really long time, but the material is actually less than one hour. It's really the questions and conversations that come out of that material that expand it, because everyone's really learning together and everyone is at different points in their career. I have people who are like, I've been working at the same company for 11 years, and then I have people who are mid-level, they've been at their current company for two years and they're like, I'm getting out because it's toxic, right? So it's that once-a-week thing, and we just do it live on Zoom and then record it for people to rewatch.

Max: Cool. Yeah. I'm thinking more in terms of a link: can people get it on your website? I'll put the link on the show notes page.

Anne: Yeah. So you can go to attractdreamjob.com and that will take you to my page. I'm in the process of updating some things but that's where you can find my information now, attractdreamjob.com.

Max: All right, cool. Great. I'll encourage people to check that out. So, machine learning: what's been your experience in, and I hate the term, but I'm about to say it anyway, the ML space?

Anne: So mine is really a combination of classes I've taken, combined with working with machine learning engineers in some of my jobs, combined with the research I've done, and also doing talks and teaching about it, all in the context of being a product manager. Because, again, the first language I learned was C++, so it makes everything else really easy. When I got really interested in the space, I was like, let me take some classes, learn how to do this, understand how this works, not just conceptually, but get some hands-on experience, right? And then I got to work with engineers in the context of being a product manager.

And again, having a background originally in engineering gives me a bit of a different perspective than someone who's maybe coming in from a non-technical space. But the product manager is also always interested in the why. And I'm someone who's always interested in product inclusivity and product ethics, or just tech ethics, so I'm also thinking, you know, it's really great that we can build this model, but what does this mean for the people?

There are a lot of good things that can happen. But also, how might this be turned around? How could it accidentally harm people? Or how could it be abused? Right? There's somebody, her name is Eva, I don't remember her last name; she actually does really amazing research on scenarios where technology has actually been used in domestic abuse. So that's some of it. Those are some of the things that I research and talk about. We'll also just talk about AI applications in general: what are the problems they're trying to solve, and why is AI the best tool in those specific scenarios to solve those problems?

Max: Yeah, so I want to dive into both those questions. I think the first one is just the question of why, which, in a particular context is, why are we building this? Should we be building this? Not from an ethics perspective, maybe just from a business or resource-usage perspective. Is this just a parlor trick, a fun thing that we're building? Or is this actually useful? And then of course, the secondary problem of, can it be turned to evil? Or can it lead to bad outcomes? So let's just start with the first one: why are we building this?

Anne: The bigger why.

Max: Yeah, the bigger why.

Anne: What is this all for? 

Max: Yeah, as a product manager, what applications of AI do you find particularly compelling? And where have you seen people fall into some rabbit holes, maybe, where they built some stuff that didn't quite work out as well as they'd hoped?

Anne: Yeah, so I was like, oh, there are so many. I would say, let's talk about maybe one of the most common applications, because I think a lot of people will be able to identify with it. And whether it's helpful or not, it is trying to solve a problem. And that is the ad space, right? Where, when it is something that you actually want, you're like, wow, this actually is solving a problem. I actually did this, and I actually purchased it. And there are scenarios where, oh, this is actually, you're...

Max: You’re saying if you find a targeted ad, it's actually useful?

Anne: If you find a targeted ad, it's actually useful, right? Before, you used to take out a page in a newspaper because you knew that the New York Times had this demographic, had so many people in these areas, and that was it. And now they can say, okay, yeah, we know that this is their demographic, but what if you knew that this person was 34 to 45 and they live in New York City, and you can target from there? You can actually make sure that your ad is better targeted, and you're not just getting your message blasted across a lot of people. Although, one of the things I learned for the course is that advertising career things on social media is actually terrible, at least for what I'm doing. Because basically, Facebook got in trouble for discriminating against people of color and women in job ads, so now you have to enter this very general scope of audience. So those...

Max: Wait. How did they—yeah, in what way did they get in trouble? What were they doing?

Anne: They were basically making it so that people could say, “I want white men from ages 28 to 37 to be able to see this ad.” And then you would exclude certain segments, which if you're maybe selling, I don't know, some rock band tee, maybe that makes sense. But when you're starting to talk about jobs, and you start talking about employment discrimination in the United States, now there's a problem, right? So that's something where Facebook was like, “Oh, crap, we legally got in trouble for this, even though all we did was just let this person run the ad, the person is the person who technically put up these things.” 

But now, for example, if I say, well, okay, I'm a Black woman, I want to target Black women for my coaching program, I actually can't do that. Because my coaching program is career-focused, they won't let me do it. So I get entered into this thing that's a mixture of an audience. And the other thing is, because I can't target it more, I get people who are...

Max: I have a complaint.

Anne: Yeah.

Max: When I first started the podcast, I tried to run a few ads on Facebook, nothing huge. But I ran into similar problems: they just would not approve any of my ads, for dumb reasons. And I gave up.

Anne: Yeah, and when I first started running it, basically because of this weird segment they put me in, which is just a mixture of everybody, I got people who can't afford the course and are really just desperate for any job at all, which is not necessarily my demographic, because I couldn't hone in on who my segment was. And that's the thing: before, we didn't have this option, right? It didn't work well for the person who was selling, but they didn't have any other choice. It was better than not advertising at all. But that conversion rate was a little different.

But when you actually can hone in on things, the right people actually can be helped, because they get the actual thing that they need, the problem they want solved. And then it also works for the business owner. But this does feed into the ethics thing, because it takes a lot of data, maybe not as much as they are collecting, but a lot of data for these companies to collect, to be able to hone in on those things in the first place. Even if you put aside the weird segment that Facebook tried to put me in for advertising my coaching program.

Max: Those segment things can get crazy. I've done a lot of work with that. And the AI ethics field is, well, it's a mess. I can wrap my head around stuff that I've been asked to do for work, and I'm like, okay, I don't think this should exist, so let's not work on it. Or I can wrap my head around, okay, we're building; well, a good example is way back in the day, when I built the Foursquare ratings, I'd be like, okay, I want all the businesses to be treated fairly, so what would I be worried about if I were a small business treated unfairly? And how can I fix that when I design it? But when you look at what's going on at Google, and all those firings slash resignations and all the stuff they're arguing about, I don't get what's going on over there. I sometimes feel like it's just Debate Club and it's not real life. But I don't know, what do you see there as the most important issues these days?

Anne: I mean, what are the most important issues? Well...

Max: I mean, let's not hold on to the way I asked that. Let's not make it "the most important," because that puts too much pressure on you. But what are you thinking about these days? Let's put it that way.

Anne: I mean, really, one of the things I think a lot about, as I said before, is inclusivity. And it's not just who the product is for and making it inclusive, but also who is in the room, and who actually has the ability to speak up and say, I actually think this. Or who is actually able to contribute their ideas, and have those ideas heard and implemented. And I think that piece of it, right, and I've said this on another podcast, but I'll repeat it here: it's kind of problematic if you're not a Black person and you're like, well, let's think about if I was a Black person, what would I experience with this? Or if I was an Asian American person, how would I experience it?

Max: I just did that with small businesses, as I said before, yeah.

Anne: And there's an extent to which that works, you know, at a surface level. But long term, what is your strategy for incorporating this? If you can't, for now, do the hiring, are you paying groups of people? Is this part of your process, so it doesn't feel like something that gets thrown out the window when business gets hard? Is it something like a retention program, if you notice your employees with disabilities, or people of color, tend not to stay as long? It's those kinds of things: what specifically are you doing? Because if people don't have a seat at the table, an equal voice at that table, then you just have a bunch of the same people guessing based on things they read and heard on the internet. And it's different, because you can't possibly read all the things that a person experiences in a single day. If I said, "Hey, Max, explain to me your entire day, and everything you perceived and everything that happened to you today," you'd probably just be like, "I'm exhausted, I don't even want to do that." And that's the thing: we all have our own limitations in how far we can actually design for other people. We think about design thinking, I know that's an overused term, and that kind of thing. We have limitations when we're trying to do that, in how we're trying to design for others.

And that's one of the things. Funnily enough, I think some of this Google stuff was, whether it was over this or just tangential to it, about a paper on how these large language models can be harmful to certain people and exclude people. The person who brought that up, she's the daughter of immigrants from, I think it's Ethiopia, right? And she's a Black woman leading one of the biggest AI research departments in the world, or at least AI ethics. And that whole thing is such a mess. You realize, wait one second: you had somebody who actually called this out, who would actually understand this, because English is not necessarily her parents' first language. And even though I'm sure that they, I don't necessarily–

Max: I don't think it was hers either.

Anne: Yeah, so that's the thing: when you think about it, she would be someone who's an authority on this, and you don't even want to listen to her, and then you bully her out of the organization. It's, you know, how are you supposed to build ethical anything, when even somebody who would be a good person to tell you, "Hey, this is how this is gonna hurt people, people like me, or people like my parents," gets pushed out? And you're so caught up in your own organizational politics, or threatened by whatever it was in the paper, or how she did whatever she did. It's a very public way to be biased and messy.

Max: So I think everything you said before applies to all product development, really, not just AI. But is there anything you could say that's AI-specific? Because I feel like when you work on a machine learning project specifically, there are always a lot of unknowns, because you're not really in charge of everything that happens when you design such a system.

Anne: So there's a data scientist, Ayodele Odubela, and one of the things she specifically talks about is even just the data sets that you're choosing to use, right? You have to understand the data that you start with. For example, a lot of facial recognition is mostly trained on, in reality, white men. Once you get to women, it's less accurate; once you get to people of color, even less accurate. And then, basically, if you have a very dark-skinned woman, it is wildly inaccurate.

And these are things where, if you understand that the data set you start with is biased, somebody has the option to say: do we need to create a dataset? Do we need to go out and get different people's faces? Is there one that already exists that we need to purchase or access somehow? Or is it a thing where you need to analyze it, because there's a way to approach it that takes out some of those biases? There's always going to be some level of bias, but the problem is not considering that initial data set at all. Because really, machine learning is learning.
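The kind of audit Anne describes, checking whether a model's accuracy holds up across demographic groups before trusting it, can be sketched in a few lines. The group names and records below are invented for illustration; a real audit would use a labeled evaluation set:

```python
# Minimal sketch of a per-group accuracy audit: skewed training data
# tends to show up as a skewed error rate across groups.

from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, predicted, actual) tuples."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        if predicted == actual:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

# Invented evaluation records: group_a is well represented in training,
# group_b is not, so group_b's accuracy suffers.
records = (
    [("group_a", "match", "match")] * 9
    + [("group_a", "no-match", "match")]
    + [("group_b", "match", "match")] * 2
    + [("group_b", "no-match", "match")] * 2
)

print(accuracy_by_group(records))  # group_a ~0.9 vs group_b ~0.5: the gap flags the skew
```

The aggregate accuracy here looks decent (11 of 14 correct), which is exactly why reporting one overall number can hide the kind of per-group failure Anne is pointing at.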

Max: Yeah, it's not just, are you learning it well; it's, what are you teaching it? And it's not necessarily that "the data" is "teaching" the machine the wrong thing. It's just that if it doesn't have enough examples for certain situations, certain populations, it's not going to work as well.

Anne: Exactly. And then the other problem you see sometimes is that there's no way to help the model. This is obviously not true for all models, but sometimes they build things where there's no way to give the model feedback, to nudge it in a different direction if it gets too far in one direction. And that's also a problem, right? For example, if Google were to tag an image as something offensive, you need to have a way to report that and say, actually, this is not that, and this is actually harmful or biased, or something like that.

Max: As an aside, we have an Amazon system tagging the photos that come in on Foursquare, and some of the ones it tags as offensive or inappropriate are hilarious, or totally just fine. Let me see if I can find an example here. Hold on...

Anne: I can also talk through some of the other things, because with some of these systems, they start with the biased data, and then there's not really a way to give new feedback—to basically say, "Oh, yeah, this didn't work out." Think about the COMPAS recidivism model.

Max: Here it is—the last one that came in, it's a picture of a dog who's on his back, like "rub my belly," and it says: photo rejected, subject suggestive, bare-chested male.

Anne: Oh, my gosh. So the puppy is a "bare-chested male"—but not, technically, in a way most humans would perceive as offensive. And that's the thing: how do you give that feedback? Right, and that's a relatively harmless example, but we can see how quickly those things can get out of control. And then think about how many algorithms are being used in our justice system and in our prison system. One example people use a lot is that COMPAS recidivism model, right? A lot of times people were just looking at the output, and they were like, "Oh, that person is more likely to commit a crime." But what they found is that it was heavily biased against Black and brown people. Yet if it was actually used in combination with a human, the outcomes were better and less biased. But here's the thing: I'm not saying the COMPAS algorithm should get to a point where it's used completely by itself. But why is that–

Max: Hold on, why is that? And can you explain what COMPAS is?

Anne: Yeah. So there are companies that produce software that tries to predict recidivism. Recidivism is the likelihood that somebody who's been convicted of a crime will commit a crime again. So they're trying to determine what happens next: should this person get bail? What are the next steps? This can actually mean the difference between somebody getting to go back home to their family and having to stay in prison. Prison is a traumatizing place, especially when you think about how many people are in prison for relatively minor offenses. And now, during COVID-19—is that a place where you want to be? You didn't want to be there before; you definitely don't want to be there now. And with that model, basically, they found that it's taking in data from our justice system, which we already know is not the most unbiased place to get data from. But where else are you going to get data? There's no alternative universe with a fair United States justice system you can get data from—though I know people could argue about what you could do to fix that.

But here's the other thing: on the other side of it, there wasn't any input to say, "Yeah, we had a human review this, and the human disagreed with what the model said. And here's our feedback: this was an accurate assessment versus this was an inaccurate assessment." So the model just kept being biased and inaccurate. It's the data that you feed it on the front end—you have to think about what that is—and then also, how do you help it correct course along the way? That's another piece of it, especially right now, when a lot of these things still need human intervention.

And even when you think about one example going on in AI—it's a great example, though it wasn't the first thought in my head—it's what they're doing in medicine, right? This is changing a lot of health care, because AI looking at these images and deciding, do you have cancer, do you have a heart problem, is actually identifying these things way earlier. Especially when you start thinking about things like breast cancer—that's something you want to find as early as possible. Using the model in combination with an actual human doctor, you can find it way earlier and with a lot higher accuracy. But if something started going awry with the model, you wouldn't want all these women being told, "Oh, you have breast cancer." Then they're stressed out, there are additional doctor's appointments, additional medical costs—and then it turns out they don't have it. I guess it's a relief that you don't have it, but it's a really stressful thing. And this next story wasn't because of an AI model—

But that's something that happened to my mom. My mom is a breast cancer survivor, and she was actually wrongly told that something weird was going on with her breasts years after she went into remission. It turned out it was nothing. But it's something where she's like, "I don't want to go through this ever again." Her mom—my grandmother—died of breast cancer; her sister got breast cancer twice and survived both times. So these are things where it's not just, "Oh, well"—this is people's lives. And then the worst part is the flip side: people who have breast cancer being told, "You don't have it, you're good." And so this is why, beyond just using these models alongside humans—if there is a need to course correct, how do you give feedback to the model? How do you give it the actual outcomes? How does it know whether it was accurate? In some of these systems, there's not even a feedback loop, right? It just spits out an answer, and people believe it. Cathy O'Neil talks about this in Weapons of Math Destruction—people are very quick to think, "The algorithm is smarter than me, so I'm just not going to question the answer it gives me." And that's a dangerous place to be.
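A minimal version of the feedback loop Anne is asking for—logging the human reviewer's verdict next to each model prediction, so disagreements are recorded and can feed back into auditing and retraining—might look like the sketch below. All names and fields here are hypothetical:

```python
# Hypothetical audit log: pair every model prediction with the human
# reviewer's verdict, so disagreements are captured rather than lost.
audit_log = []

def record_decision(case_id, model_says, human_says):
    entry = {
        "case": case_id,
        "model": model_says,
        "human": human_says,
        "disagreement": model_says != human_says,
    }
    audit_log.append(entry)
    return entry

record_decision("case-001", "high_risk", "low_risk")  # human overrode the model
record_decision("case-002", "low_risk", "low_risk")   # human agreed

# Disagreements become labeled examples for auditing and future retraining.
flagged = [e["case"] for e in audit_log if e["disagreement"]]
print(flagged)  # ['case-001']
```

Without something like this, the system only ever sees its own outputs—exactly the "no feedback loop" failure mode described above.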

Max: Hmm. Yeah, I think in machine learning engineering, data mining, data science, you're taught to build in that feedback loop. But then, in practice, those are often the corners that get cut.

Anne: Yeah, and I think a lot of that is—and this is also why I think it's important, as a product manager, as a non-developer, to speak about this type of thing. Because people speak about it as an engineering thing, I feel—and maybe I'm wrong about this, but so much of what I hear is about the engineers. And I think the engineers do need to understand this stuff and learn it; there are people saying, you know, engineers should take sociology classes and that sort of thing. I think all that's important.

But here's the thing. If the product managers who are leading this don't know a thing about tech ethics, don't understand a thing about the data coming in and the data going out—if they don't understand that, "Oh, your zip code could actually be a really good indicator of race or socioeconomic status"—they don't know these things. And they're like, "Well, this is going to help us target X, Y, Z, and we're going to make this much money," and they're selling this to the VP of whatever, right? And that's not that organization's priority, whether it's because they don't know or they just don't care. This is a thing where I think people who are not engineers aren't being told enough: here are the things that you need to know, that you need to be aware of. And actually at my day job, we just wrapped up Black History Month programming, and as part of our programming, we did a book club for Algorithms of Oppression by Safiya Noble. And the thing that's always interesting when we do a book club for Black History Month is the learnings that come with it.

And, you know, we had a lot of people in that book club who weren't engineers, and a lot of people who were not Black or people of color, and they were like, "Are you kidding me? This is a real thing that happens, and search engines can do this?" And that's the thing: we have people who work in tech—even if it's not at Google or Facebook, maybe they'll end up at Google or Facebook one day, or at a small little start-up that doesn't have the resources to send people to training about this, right? And these people aren't being taught this. And they're going to tell the engineer, "This is what we need you to do." It's not that I'm trying to take the onus off the engineer, but as a product manager, it's really important for me to bring these topics up, because I've talked to people who are not engineers, and they're like, "I've never heard of this in my entire life."

Max: Hmm, yeah, I have seen engineers push back on things, although they oftentimes forget that that's an option. But yeah, all right. Let's see, we have a few minutes left—maybe we could talk about blockchain. I'm kind of curious to get your take on that. I notice on your website you write "blockchain" and not "Bitcoin." So what's your interest in that area?

Anne: Yeah, so I actually worked in the blockchain space for a bit. I worked at a startup in the legal tech space, specifically part of the ConsenSys ecosystem. And we worked on basically building a lot of the protocols and infrastructure for Ethereum smart contracts—which, for those of you going, "What the heck did she just say?"—

Max: Hold on, I have an episode on that. It's a little bit tough because of the delay, but in Episode 5, I talked about Ethereum smart contracts. I just like to pepper these in.

Anne: Excellent. Yes, please go back and listen to that episode. I highly recommend it.

Max: It’s two years ago already.

Anne: Yeah. Because it'll also make whatever I say next make more sense. Whereas the short version is...

Max: Give us the two sentences. Yeah.

Anne: Yeah, the two sentences: basically, these are automated contracts that can take the terms of a contract—because if you look at contracts, a lot of them are if-then statements, right? If this happens, this is what we're going to do; if that happens, then this is what we're going to do. And it turns the contract into executable code. Especially for things where you can pay automatically in crypto: if I press a button saying that Max sent me this box of chocolates, then my crypto that's been in escrow as part of this smart contract automatically goes to Max. And there's a whole contract where, you know, we didn't need to go through a bank—it could be directly in crypto, and it could automatically execute itself based on the terms and conditions in there.
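The if-then shape Anne describes can be illustrated with a toy escrow in plain Python. Real Ethereum smart contracts are typically written in Solidity and run on-chain; this sketch only shows the logic—funds locked at creation, released automatically when the agreed condition is confirmed. All names and the class itself are hypothetical:

```python
class EscrowContract:
    """Toy model of a smart contract: crypto is held in escrow and
    released automatically once delivery is confirmed."""

    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.delivered = False
        self.paid_out = False

    def confirm_delivery(self):
        # The contract's "if this happens, then do that" clause as code.
        self.delivered = True
        if self.delivered and not self.paid_out:
            self.paid_out = True
            return f"{self.amount} ETH released from escrow to {self.seller}"
        return "no payout"

# Buyer confirms the box of chocolates arrived; escrow pays out automatically.
contract = EscrowContract(buyer="Anne", seller="Max", amount=5)
print(contract.confirm_delivery())  # 5 ETH released from escrow to Max
```

The key property—and the reason no bank is needed in the middle—is that the payout is a consequence of the code's conditions, not of anyone's later decision; a second confirmation attempt pays nothing because the funds are already released.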

And so there's a distinction here: Ethereum was built really to have apps built on top of it, whereas something like Bitcoin was built really as a currency. Obviously, Ether is a currency as well, but Ethereum has all these decentralized apps that people build on it, and those rely on smart contracts. And some people are like, "I know I have to know about the blockchain, I have to have blockchain developers, but I want to focus on building my decentralized app—I'm not trying to build a whole protocol to talk to a smart contract." So that's the space I worked in, and I got to work with a lot of other cool people and startups adjacent to us and learn a lot about the blockchain space.

And so even though I am interested in crypto and that kind of thing, I'm also the kind of person to say, wait, but over here, you want to understand why blockchain is important to crypto and specifically what you can do besides Bitcoin with blockchain, right? Because there's a lot of applications and one of my favorite things that I've seen, there's this company called AgriLedger, where they're basically using smart contracts to record things on the blockchain as these mangoes get sent from Haiti to the US, into grocery stores, right? Because you think about our produce. We don't really know a lot about how it got here. 

We could—I could draw you a picture of how it got from a farm to here. But there's no one actually recording through the whole process, no one with a bird's-eye view of how much the farmer got paid and on what date, that kind of thing. There's no real record of when this left the farm—what date and time did it leave? What type of transport was it in? When it was in the big shipping container at the airport, what was the temperature of the mangoes? What was the temperature of that container? Because do you want something that's been sitting in the sun? If the air conditioning fails, a sensor goes off and says, "Hey, the air conditioning is broken, and it's been sitting at this temperature for this amount of time." So you can actually track these things, and it creates this blockchain record of everything—which, number one, also ends up helping the farmer get paid fairly. Because today, the way it works is, if you go to a farmer in some of these places, the farmer doesn't necessarily know they'll ever see the person who's trying to buy their mangoes again. So they're willing to take a really, really, really small portion of what they're worth in order to just get paid today.

But with the blockchain and all these other records, there are all these things that are actually timestamped, saying: this is what happened, this is what they agreed to. That makes it so that now there's this legally enforceable thing, and I'd have recourse if this person never shows up again with my money. So the farmer gets a little bit upfront, but then they get more later. And then it tracks everything. And as the grocery store trying to buy these mangoes for your customers, you have the confidence that these mangoes aren't going to go bad the day after you finally put them out—leaving you with a bunch of mangoes you have to sell for less than you bought them for, that are going to go bad, that your customers aren't going to want to buy. You'd have just wasted a ton of money on those mangoes, right? So that's one of my favorite applications of blockchain that has nothing to do with crypto.
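The tamper-evident record-keeping Anne describes—each supply-chain event hashed together with the previous entry, so rewriting history breaks the chain—can be sketched as follows. This is a simplified illustration, not AgriLedger's actual implementation, and the event fields are invented:

```python
import hashlib
import json

def add_event(chain, event):
    """Append an event, hashing it together with the previous entry."""
    prev_hash = chain[-1]["hash"] if chain else "genesis"
    payload = json.dumps(event, sort_keys=True) + prev_hash
    chain.append({
        "event": event,
        "prev": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    })

def verify(chain):
    """Recompute every hash; any edited record breaks the chain."""
    for i, entry in enumerate(chain):
        prev_hash = chain[i - 1]["hash"] if i else "genesis"
        payload = json.dumps(entry["event"], sort_keys=True) + prev_hash
        if entry["prev"] != prev_hash:
            return False
        if entry["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
    return True

chain = []
add_event(chain, {"step": "left farm", "date": "2021-03-01"})
add_event(chain, {"step": "container temperature", "celsius": 14})
print(verify(chain))   # True

chain[1]["event"]["celsius"] = 40   # try to rewrite history
print(verify(chain))   # False
```

This is the property that lets a farmer, a shipper, and a grocery store share one record without fully trusting each other: anyone can recompute the hashes and see whether the history has been altered. A real blockchain adds distributed consensus on top of this hash-linking.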

Max: Are these things actually deployed now, at any scale? Because it sounds really cool, but how much is this stuff actually being used?

Anne: Yeah, so it's early days. These things are being deployed, but not at full scale—it's not like every single grocery store has signed on yet. I'd compare the blockchain space a little bit to the early days of AI, where we went through a few AI winters. That 2018 crypto crash—even though crypto is only one application of blockchain—really cooled everybody's interest in blockchain in general, because crypto was the fun thing, because it's money. And that also went for people's excitement about these applications. So I'm starting to see more companies actually purchase and adopt these things.

But are we talking large scale? I think the largest-scale example we've seen is Walmart, which has been doing more with the blockchain because they realized that a lot of the companies they were buying things from to put on their shelves actually had no idea how their own stuff got to Walmart. So Walmart started putting things on the blockchain to try to track it. That's actually one of the biggest at-scale efforts, but it's through IBM, right—and I won't get into the whole argument of public versus private blockchains here. I'm sure, Max, you have another podcast about it. Or you can talk to me or someone else later about public versus private blockchains.

Max: I don't have an episode on public versus private blockchains, but I think it would be good to mention that one of the big arguments is not just public versus private blockchain, but: do we want a blockchain at all? Can we just do this with our existing database technology? People try to sell you all these applications, and I think the industry is still trying to figure out, okay, what's something that really helps us, and what's just a really complicated database?

Anne: Yeah. So actually, I taught a course on O'Reilly called Business Applications of Blockchain, and that's one of the things I talk about: does this actually need to be on the blockchain? Because there are several cases where you could just have a very complicated, secure database for this—you don't need the blockchain, right? Bitcoin makes sense because they're trying to do this whole trustless thing, and they don't want anything centralized; they don't trust the banks, they don't trust the corporations. But for other things, people would argue that if you need a private blockchain like the one IBM is using, you really just need a database—and you can debate, depending on the specific project, whether you need that or not. And that's the thing: there's less of this now, but especially around 2017 to 2018, literally anybody and everybody was just saying, "Yeah, we use blockchain technology," because you could just get funding—people were throwing money at it. I think there's slightly less now than there was then, because the fervor around that time was so high. But then a few too many people got scammed by IPOs—I think we were losing $9 million, possibly billion, I'll check on that, but at least $9 million a day to IPO scams in 2017. So I think people are a lot more wary about it now, and they're starting to actually...

Max: The ICO.

Anne: The ICO, sorry—my brain is switching contexts. ICO scams. And so I think there are still people selling things that don't really need to be on the blockchain, but they're like, "I'm excited about this technology, and I'm doing stuff." And, you know, there were probably things in the early days of AI that, given where AI was at the time, didn't need to be solved with AI—but it helped move things forward, right?

So I think there are some of these projects that probably don't need blockchain. But some of these things will become more robust, or at least faster—because especially when we talk about public blockchains, with this consensus model that verifies all the transactions, it's very slow. We're talking about distributed systems; all the nodes have to agree, right? So it's much slower, and when you think about how fast our internet is today, people don't like waiting around for the computers to agree that a transaction went through. So I would say it's not ready for the average consumer, but the early adopters are clearly very excited.

Max: Right. Well, the early adopters are certainly getting in on it. People have been in Bitcoin for a while, and it's kind of going mainstream—maybe not mainstream in terms of people buying and selling with it, but it's in more people's portfolios, directly or indirectly through a company that you own. But, you know, I did an episode recently about NFTs, the non-fungible tokens, and I'm still doing research on it. I asked a lot of questions about whether these things have any use, and I suspect some of them do and some of them don't, because some of these questions are tough to answer. Hey, are you really gonna get paid every time this piece of art is used? There's no guarantee on the blockchain that that's gonna happen—Bitcoin has its guarantees, but for this you just kind of have to trust people. So have you looked into those at all?

Anne: I've looked a little bit into them. I'll be honest, I can't speak a lot to them. For people who are listening who are interested in learning more: Tonya Evans—she's a professor, I think at Penn State's Dickinson School of Law—has several conversations you can find on the internet specifically about NFTs. I know Kings of Leon just released—they're doing an NFT album, which I haven't looked too deeply into, in terms of how that's specifically going to work. But it is something I've actually talked to Tonya Evans about recently. There are uses, I think, especially when we talk about certain things—is there a digital representation of something like the Mona Lisa? That makes sense.

But in terms of wider adoption—I think it will be a thing, but will it take over everything that's unique, like every album, that kind of thing? I don't necessarily know, because here's the thing. I've seen startups that want to solve the problem of getting artists paid using the blockchain—making sure that every time somebody plays something, the artist gets paid, and you can cut out some of the big music business or recording companies. And the thing is, again, we're talking about a public blockchain. It's slower than normal. Maybe the music isn't necessarily the thing on the blockchain—it's just, "Hey, you accessed this album"—but it still has to register that this person is accessing the NFT or trying to transact.

And the average consumer says they want artists to get paid—but does the average consumer really care that much about that? The average consumer's real problem is that they're just trying to listen to music; they want it now, and they want the music they like, right? So if Spotify tomorrow said, "Yeah, we're changing everything over to this blockchain system, and we're gonna make sure all the artists get paid," I think people would like the sound of it—it'd be really cool for marketing—but in terms of actually using the product, I'm not sure they'd care. And that's one of the things going on now: blockchain sounds really cool, and it might be a cool marketing tactic, but the reality is you have to ask yourself, do your specific customers and users care about that thing? And if they don't care, does it give you some sort of advantage in another way? Then it doesn't matter that your customers don't care, because it actually makes things work better or solves some other problem behind the scenes that the customer doesn't see—they just know that it works really well.

And I think that's the thing: as a product manager, what I think about is, does my specific customer care about this thing? And that's the other problem with some of these blockchain startups, especially the more consumer-facing ones: you're going to have trouble scaling at the rate that VCs and other investors want you to scale, because you don't have enough customers who care about the problem you're solving. Your customers don't care that, "Oh, it's cool because it's on blockchain." Your customers don't care—I hate to say this—that the artist gets paid. They say they do, but when we look at the numbers, how many people use Spotify versus other methods? People don't. Right. So I think maybe eventually we can get to a system where they're like, "Oh, this works, and it's not inconvenient." But until things are robust enough that it works, it's not slow, it's not inconvenient—it's just part of the thing that makes it better—then they'll care more. It's kind of like AI 10 years ago, right? I think some people knew about it, but your average consumer probably didn't think about it.

Max: Oh, that was when I first got into it.

Anne: Yeah, yeah. Right—whereas your average person probably wasn't thinking, "Oh, AI is gonna help me find all the coolest music," right? If you'd tried to market that 10 years ago as an advantage, especially with where AI was then, I don't know that people would have cared that much. But obviously in the years since, Spotify has so many algorithms, right? And they have so many data scientists. And that's something where they can now say, "Yeah, we can not only give you the music that you want, but if you want to advertise here, we have a really great idea of who these people are, what they want, and the kind of music demographic they listen to." So it's not that I'm necessarily criticizing, "Oh, this is a bad idea"—it's that sometimes your idea is, again, wrong place, wrong time. Or right place, wrong time. There's a lot to consider there.

Max: Yeah, well, that's the billion-dollar question for content creation—making sure that artists get compensated, and what the model for that is. That would be a lot to get into. I think this is a good time to wrap up, but Anne, thanks for coming on the show. I can already tell from my notes there are going to be a ton of links on the show notes page for this episode. But tell me where we can find you, your website, and some of your courses—and do you have any last parting words after today's conversation?

Anne: Yes. So if you want to find me more on the AI blockchain side you can find me at www.annetgriffin.com, so that's A-N-N-E-T-G-R-I-F-F-I-N dot com. That's also where you can contact me. I'm also on Twitter, @annetgriffin. And if you're looking for my coaching program that basically helps you attract your dream job, you can go to attractdreamjob.com or follow me on basically Instagram or Twitter, @pivotgrowhustle.

Max: All right, and thanks for coming on the show today. 

Anne: Yeah, it was great being here. Thank you for having me.

Max: All right, that was great. Next week on The Local Maximum, I'm going to be talking to Peter McCormack of the What Bitcoin Did podcast, which is a very fun Bitcoin podcast doing a lot of big work these days. Definitely stay tuned for that if you want another Bitcoin and blockchain discussion, as I'm sure you all do. Even if you don't, it's gonna be great. So tune in next week. Have a great week, everyone.

That's the show. To support The Local Maximum, sign up for exclusive content and our online community at www.maximum.locals.com. The Local Maximum is available wherever podcasts are found. If you want to keep up, remember to subscribe on your podcast app. Also, check out the website with show notes and additional materials at www.localmaxradio.com. If you want to contact me, the host, send an email to localmaxradio@gmail.com. Have a great week.

Episode 165 - Peter McCormack of the What Bitcoin Did Podcast


Episode 163 - Flying Cars, Causality, and Digital Art Storage Concerns

