56: On the "big brain" that makes metrics accessible to company employees
The transcript for this episode was produced with an AI service. If you find an error, we'd be glad to get a correction here
Ran Erez: Ben, thank you so much for being here.
Ben Erez: Yeah, thanks for having me. This is fun.
Ran Erez: Amazing. So for everyone who may not know you, maybe just introduce yourself a bit.
Ben Erez: Sure. So yeah, my name is Ben. I live in Brooklyn, New York with my wife, her three-year-old daughter and an eight-year-old golden retriever. I started my career in finance, got out very early, and then I was a failed founder, but I learned a lot and found my way into product management. I was the first PM at three different startups, and I was a PM at Facebook and at Attentive. I was a founder. And then two years ago, I decided to do something different with my career. I went off the beaten trail and started what I might consider a solopreneur journey. I started off thinking about doing fractional product work, like advising and consulting, and I have done some of that over the last couple of years. But the thing that really pulled me more than anything else I've done over the last two years is product management interview preparation. I used to do this as a favor to people, to help them get ready for interviews at places like Meta. And then at some point, I helped someone land a very lucrative offer after spending about six hours helping them prepare, and they said, you're an idiot for not charging money for this. So I started to charge, and that all led to a Maven course that I launched about a year and a half ago, coming up on two years in April. And that Maven course paved the path to my copilot, which is what we're going to talk about today, as a tool for helping my students. But ultimately the impact of that expanded well outside of my course. I have a few other things I'm spending my time on these days, but it's all in the general PM interview prep space.
Ran Erez: Yeah. Great. So you've used the word copilot a few times. How do you define a copilot?
Ben Erez: Yeah. It's one of those words that has a lot of different meanings to a lot of different people, kind of like agent. So let's just talk about it as a tool, and then I can tell you why I decided to call it a copilot. What the tool does at its core: you could think of people using Claude, ChatGPT, Gemini, whichever LLM is their tool of choice. About a year and a half ago, we had Tal Raviv on my podcast, and I got to know Tal, and Tal was telling everyone about Claude Projects. I'd never played with Claude Projects before, but I started playing with it and I had a light bulb moment. By that point I had been teaching my course for about five or six months, and I had a really good feel for what people were asking me about. They were asking me for help filling out my templates. I created these very popular, free interview templates for product sense and analytical thinking interviews. People would make a copy, start working through a question from a question bank, fill it out, and then ask me, hey, can you give me feedback on this submission? And that was very time consuming, because it wasn't a ten-second thing; I had to understand the whole context of the exercise. So I was like, I need a way to scale my ability to give feedback on those filled-out submissions.
Ben Erez: And then the other problem was that some people get very fixated on the way a question gets asked in the question bank. They see a flavor of a question, a way an interviewer phrased it that they've never seen before, and it can be kind of scary for them. So they'd ask: how would you tackle this? How does this map back into your framework? And I found myself spending a lot of time getting on calls with people and saying, here's exactly how you fit this into my framework; don't be intimidated by the question. So I had these two big problems, I was spending way too much time with students trying to answer them, and I knew that wasn't going to scale. That was my light bulb moment for the copilot. I ended up calling it the copilot because it's a practice copilot. When I tell people you're going to spend maybe 20 or 30 hours getting ready for your Meta interviews, or your product sense and analytical thinking interviews at another company, how are you going to use that time? Are you going to spend it just doing work on your own? Are you going to spend it all on mocks? Something else? So I tried to create a very opinionated path for how people could spend that time. And I generally tell people to start with the copilot when they're ready to get reps with the material, but before they do mock interviews.
Ran Erez: For other PMs listening to this, we can frame it as a challenge: how do you scale yourself, and how do you personalize your answers when you're not there? With that framework of personalization and scale, those are the initial questions we need to ask when considering whether a copilot is right for a situation. So today, what we're going to do is break this down, both through your use case and by expanding it into, let's call it, Ben's playbook for a copilot, to really understand the nitty gritty and everything we need to think about when launching a copilot, whether to scale ourselves, personalize ourselves, or both. Okay. Sounds good?
Ben Erez: Yeah, let's do it.
Ran Erez: Okay. So let's take your case study and let's start. You said it was about a year ago when you started it. Maybe walk us through the steps you took to get it off the ground.
Ben Erez: Yeah. I would say the very first thing was just figuring out if it even works, right? So I wasn't thinking about marketing. I wasn't thinking about packaging or pricing. The original idea was: could I build something that helped me as an internal tool, as an instructor? And I started with a product sense question. In hindsight, I probably would have started with analytical thinking; I think it's actually a little easier to codify analytical thinking interviews in this way than product sense. But nonetheless, that's the first step: just play with it for yourself. Build something that you could use, that adds value to you and gives you leverage for something you find yourself doing a lot.
Ran Erez: Then you started working in Claude Projects, right?
Ben Erez: Yeah, yeah.
Ran Erez: Okay. What did that look like in real time? What did you do?
Ben Erez: Yeah. So the cool thing about Claude Projects at that point is that there weren't that many levers to play with, right? You created a project, and then there are instructions and there are files. That's it. That's basically all it was, and it was doing RAG to try to come up with answers. So the first thing I tried: I didn't put in any files, I didn't put in any instructions, I just asked an interview question.
Ran Erez: Okay.
Ben Erez: So I asked, I think, the most common question that I've seen over the years, which is something like: how would you build a product for volunteering?
Ran Erez: Yeah, right.
Ben Erez: I may have said, you're a PM at Meta, how would you build a product for volunteering, or something like that. And I just saw what it came up with. And obviously it jumped right into a solution, like a...
Ran Erez: Pm, right?
Ben Erez: Like a bad PM. So I was like, okay, it would have no way of knowing that that's not the right way to do this. So what if I go into the instructions and say: hey, every chat in this project is going to be me asking you an interview question. Before you jump into solutions, I want you to frame the context around why this matters, then walk me through who you might build this for, then tell me what their problems might be, and then land on a solution at the end. So I started very simple like that. Then I went back to the chat. Claude Projects has this button called retry, and you can hit retry with no changes. What that does is have the AI take another stab at generating the response, but it pulls the latest instructions from the project. So I got to see what it did with the latest instructions, and I was like, okay, that's a little better, but this part is off. And I basically did that a few hundred times over the course of probably a couple of weeks, just to get the output for product sense and analytical thinking to be what I would expect to see in a passing answer.
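The loop Ben describes, writing project instructions that force framing before solutions, then retrying and eyeballing the output, can be sketched as data plus a rough structural check. This is a minimal illustration, not Ben's actual prompt: the wording, the section keywords, and the `steps_in_order` helper are all my own assumptions.

```python
# Hypothetical project instructions in the spirit of what's described above:
# make the model frame context, pick a segment, walk a journey, and only
# then land on a solution -- never jump straight to solutions.
PROJECT_INSTRUCTIONS = """\
Every chat in this project is me asking you a product sense interview question.
Before proposing any solution:
1. Frame the context: why does this space matter?
2. Choose a specific user segment you might build for, and say why.
3. Walk through that user's journey and identify their problems.
4. Only then land on a solution, tied back to the chosen problem.
Check in with me after each step before moving on.
"""

# Illustrative keywords standing in for the framework's ordered steps.
REQUIRED_STEPS = ["context", "segment", "journey", "solution"]

def steps_in_order(answer: str) -> bool:
    """Rough check that a draft answer covers every step, in the expected order."""
    text = answer.lower()
    positions = [text.find(step) for step in REQUIRED_STEPS]
    return all(p >= 0 for p in positions) and positions == sorted(positions)
```

A check like this is obviously much cruder than rereading a few hundred retries by hand, but it captures the shape of what the iteration was converging on: a fixed, linear answer structure.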
Ben Erez: So once I had that all up and running, I pinged a few people I knew who were calibrated interviewers from Meta. I said, hey, I built this thing for my course, I'm about to give it to my students, and I don't want to take them off track and lead them astray. Could you pick any product sense question you like to ask in interviews and play the role of the interviewer? If you agree with what it's saying, say yes every time it checks in with you; if you think it's not making sense, ask it a question like you would ask any other candidate. Then let me know when you're done, and give me your general assessment of how it went. One of the first responses I got back, in slightly different words, and I'm not going to say who it was, but it was someone I worked with at Meta, a calibrated interviewer who wasn't at Meta anymore, was: hey, can I hire this copilot? I feel like this would be a passing score for me.
Ran Erez: Wow.
Ben Erez: Yeah. So
Ran Erez: Wow.
Ben Erez: So what happened after that, because this was all happening over the course of a two-week period, is that before I put it in the hands of my January cohort students, which was a couple weeks out, I emailed all of my quote-unquote alumni from my July and October cohorts. And I said: hey, this is free for you, I'm just giving it to you. I took all your feedback, I listened to all the questions you had, and I'm thinking ahead to my next cohort. Some of you are still in interview prep, some of you already landed offers, that's cool. But regardless, it would mean a lot to me if you could play with this and tell me: how helpful is this? What could be better? So wave two was putting it in the hands of the alumni from the course and getting their feedback.
Ran Erez: So if I highlight two important things here: first, you gave it to the interviewers to make sure the bar is high enough, and then you gave it to the actual persona who is going to use it, to see how it goes and get that essential feel for whether it delivers personalization at scale. Right?
Ben Erez: Totally.
Ran Erez: Okay. And then what did your alumni say?
Ben Erez: I mean, it was very positive, but I don't want to frame this as if it was all praise; I genuinely was looking for what's wrong with it. The overwhelming feedback was: this is incredibly helpful. And then: the segmentation felt a little shallow to me, or it didn't really follow what you said, or it picked the supply side of the ecosystem when I think you generally recommend picking the demand side for this kind of question. Or, when it got into problem identification: you always talk about how you should do a journey before you pick a problem, but the copilot wasn't doing a journey, it was just going from a segment to the problems, and I actually want to get more practice with the journey piece. And so on. So I was starting to get specific pieces of feedback, like the mission statements were too verbose. People were holding me accountable to the stuff I had told them, and they were like, look, I'm confused.
Ran Erez: Ben.
Ben Erez: I'm confused because you just gave me this tool, and I'm trying to figure out whether what you told me is the source of truth or what the copilot is telling me is the source of truth. And that created an urgency for me. I was like: shit, I can't give this to a bunch of new people in January if I'm going to contradict myself, if there are going to be contradictions in the system. So that's when I went into intense tinkering mode, phase two of tinkering: okay, now I'm going to literally go section by section in both of these, take the last five questions in the question bank from Lewis Lin, drop them into new chats, and I'm not going to be done with this process until there are no contradictions between what the copilot does and what I would do.
Ran Erez: But you didn't stop there, right? Even after you tweaked it a lot based on your students' feedback, you continued to improve it. You had the ultimate test of making sure the copilot works.
Ben Erez: Yeah. And right around the time I was comfortable giving it to my January students is when I felt comfortable increasing the price of the course and actually putting copilot in the name. That allowed me to go from about $600 to $1,000 for the course price. And I asked myself, okay, what if someone doesn't want to take my course but wants access to this tool? Do I want to let people get access to it? I decided: I'll sell it separately and bundle it into the course. To your point, that was my first time ever selling a digital product like this, so I didn't expect anyone to buy it. But then it started to happen, because I linked to it from the Maven course page, since this thing was included in the course price. So people would go check it out; maybe they didn't want the course, but they bought the copilot. And I told people, if they buy the copilot, I'll give them a discount in the amount of the copilot. The original price was $299, so basically: if you buy it, I'll give you a $300 promo code on the course, and you'll be made whole if you end up wanting the whole thing after.
Ran Erez: Okay, so when was the moment the copilot exploded?
Ben Erez: It was probably April 2025, when my Lenny guest post about product sense interviews went live. That linked to all my things: my course, my templates, my copilot. And it also basically established my number one spot in SEO; if someone searches for product sense interview, my stuff pops up at the top. I think I did about $10,000 of copilot sales that April.
Ran Erez: Amazing. So we touched on a lot of things in your case study, but I want to help our listeners think about Ben's playbook for building a copilot. Let's try to draw out the main principles we talked about here, to help listeners think about their own problems and whether a copilot is a good fit for their situation. If I had to ask you: what would you say are the biggest principles?
Ben Erez: I would say the first one is to really sensitize yourself to what kind of advice or expertise you find yourself using the most, where you would be comfortable saying: I might be the best person at explaining this thing. Or: there are other great people at explaining this thing, but for some reason the way I explain it or tackle it seems to really resonate with people. So try to sensitize yourself to where you might have a superpower of some kind. And the reason that's important is because everything we talked about relied on my ability to evaluate the quality of the output.
Ran Erez: Yeah.
Ben Erez: Okay. And if I were trying to do something I was not an expert on, or something I wasn't extremely opinionated about, with real proof points that my opinionated beliefs were backed by real results, then I would be shooting in the dark, or I would have no ground to stand on as a credible voice. Now, if you're just building something for fun, as a hobbyist, or you're not taking on a high-stakes thing like interview preparation, which is super high stakes, then maybe you can loosen those requirements a bit. But if you are trying to bring your expertise into some kind of packaging that people would pay for, and maybe even command a premium price for, then I would really try to narrow in on that thing you're uniquely positioned to do. Once you know what that thing is, break down the workflow in which that expertise fits. For interview prep, it's generally: someone's getting ready for an interview, they may or may not take my course, and there are probably going to be mocks at some point. So where in that process does your tool fit? Do you want them to use it during an activity, before an activity, after an activity? Try to be really opinionated about when it fits. I'd say that's probably step two.
Ran Erez: Okay. So step number one is understanding what good looks like: the results you bring as a human being actually produce great results in the real world, and you know how to determine whether an answer is good or bad. Step number two is mapping your users' journey and understanding exactly where in the process this fits. Is it a specific step before, during, or after? Really nail down the step where you need, or could use, a copilot, right?
Ben Erez: Yes, but let me clarify. The thing you build with a product like this: you could think of it as a solution to a moment in time, or as a solution that allows people to do something repeatedly. If it's just a one-time thing, don't build a tool for it. But if it's a repeatable thing they have to do many times, then you need to understand when that thing happens. And the step-by-step iteration is not about the moments in time in which they use it; it's about breaking down the specific steps within it. It's like a conveyor belt in a factory: make sure you're covering every step of the conveyor belt. If the thing you're thinking about productizing does not lend itself to a linear progression, then this way of approaching the development will be different. I probably don't have as much advice for someone building something that doesn't lend itself to a linear progression, because the way the LLMs work here, especially with interviews, every step builds on the previous step. The solutions you come up with at the end of a product sense interview stand on the shoulders of the problems and the segments, and so on. I've had people ask: why can't I just get right into the solutions? Because in a real interview you wouldn't be able to; you have to work your way toward them.
Ran Erez: Okay, that's really important. So we said: understand what good looks like, be able to actually build it bottom-up and improve each step of the way, understand the user journey and exactly which point in time brings value, and also map that step as a linear process of progression, right?
Ben Erez: I think that's about right.
Ran Erez: Okay. What else do you think is really important to consider with a copilot like this?
Ben Erez: I think the scoping is a huge part of it, like what we talked about earlier: understanding how much you're trying to bite off. I was teaching a course about both of these interviews, but breaking them down into separate ones was an important thing. And interviews were a nice use case to get started with, because they have clear evaluation rubrics, so you can help someone understand what good looks like and what weak looks like. But a lot of life doesn't have evaluation rubrics, right? It's like great management in general: being clear about expectations, being clear about what good looks like and what bad looks like, and being able to articulate that in words will be very important too. I don't think you can get anything out of this kind of product unless you also take the time to create some quantifiable way for it to evaluate its own performance. Because that's the other thing: the copilot will also evaluate its own performance at the end of the interviews.
Ran Erez: Okay. So evaluation is another key principle: there's some sort of feedback loop between iterations, and that's critical, right? You don't just do it once. We said, if it's a one-time thing, don't build the copilot. But when you finish that linear progression, you also have to feed a clear feedback loop back into the entire process.
Ben Erez: Yes. And the other reason that helps is because there were actually two workflows I wanted to bite off. One was: how do I work through a question I've never seen before? That was the main one. But the other was: I just filled out this template, can you give me feedback on my filled-out template? To serve that second use case, I had to build the evaluation rubric. And then I realized that once I solved the second use case with the evaluation rubric, the performance I got on the first use case got significantly better, because I had codified what the evaluation rubric looked like. So my learning from that is: you don't necessarily need to be building a copilot that can evaluate some filled-out template, but codifying the rubric for evaluation strengthens the main use case, the main workflow.
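The idea of codifying the rubric can be sketched as a small data structure plus a weighted score. The dimension names, weights, and prompts below are hypothetical, chosen to mirror the framework steps discussed earlier, not Ben's actual rubric.

```python
# Hypothetical rubric: each dimension carries a weight and a plain-language
# question describing what "good" looks like. Writing this down once serves
# both use cases -- grading a filled-out template, and keeping the main
# question-walkthrough workflow on track.
RUBRIC = {
    "context_framing": {
        "weight": 0.2,
        "question": "Is the context framed before any solution appears?",
    },
    "segmentation_depth": {
        "weight": 0.2,
        "question": "Is the chosen segment specific, with a stated rationale?",
    },
    "journey_before_problems": {
        "weight": 0.3,
        "question": "Is a user journey walked through before problems are picked?",
    },
    "solution_grounding": {
        "weight": 0.3,
        "question": "Does the solution trace back to the chosen problem?",
    },
}

def score_submission(ratings: dict[str, float]) -> float:
    """Weighted score in [0, 1] from per-dimension ratings in [0, 1].

    Missing dimensions count as 0, so an answer that skips a step is
    penalized by that step's full weight.
    """
    return sum(RUBRIC[dim]["weight"] * ratings.get(dim, 0.0) for dim in RUBRIC)
```

The scoring itself is trivial; the value is in forcing the expert's implicit standards into explicit, weighted criteria the model can be held to.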
Ran Erez: Yeah. Great. So Ben, we're about to wrap up, and I'd like to ask you one final question: what is a key insight you want our listeners to take away from our conversation today?
Ben Erez: I think a lot of people dramatically undervalue how unique their knowledge is about specific things they might be the experts on. And in this new world we're going into, there's just a lot of slop, and I think it's getting harder and harder to know what content to trust and what content is reliable. So my hope, in talking more about my copilot, is not that people go, oh, that's so cool. I mean, I think it's cool, but that's not the point. The point is for people to realize: there's nothing particularly special about me; if I can do this, you can do it too. But I can't do the heavy lifting for you as a listener on what that thing should be and how you do it. That's the hard part: you can lead a horse to water, but you can't make it drink. I do want people to consider building a new type of product they hadn't considered before. And just because you can vibe-code software products now doesn't mean that's the best way to package and monetize your knowledge.
Ben Erez: Maybe there's a faster path to getting people what they need. And then, as a consumer of your knowledge: if I'm looking for expertise on how to improve my website, or how to improve conversion in my funnel, or how to improve these guides we're doing with Insider Loops. If someone has spikes and they're in the top 0.001% when it comes to information architecture and written guides, I could buy something from them that just allowed us to level up our guides. There are so many areas in my life, and I'm sure for you too, where you're looking for: what would the best person in the world do at this point with this thing? And I just don't think there's a place to go and discover a lot of that right now. I would certainly be a buyer for very specific things that fit into very specific workflows for very specific expertise. So maybe we're moving into a world of micro-expertise insertion points, and it would be really cool if someone chose to build something out of this.
Ran Erez: Amazing. For me, thinking about whether you want to scale yourself and personalize the answers is a good starting point. But the main takeaway from my perspective is: resist the urge to build this top-down. Don't go and write tons of instructions up front; build it bottom-up. Start with the baseline, then gradually add more instructions, and make sure you're controlling the output. I think that's key to getting a result that's gold standard, one you can put your name on. So Ben, thank you so much for this conversation. It was really, really interesting.
Ben Erez: Yeah. Thanks for having me. This was fun.
Ran Erez: And for everyone listening, thank you. You can follow us on all the podcast platforms to make sure you know when a new episode comes out. Again, thank you so much for your time, and thank you everyone for listening.