How to measure AI developer productivity in 2025 | Nicole Forsgren


Channel: Lenny's Podcast
Source: YouTube
https://www.youtube.com/watch?v=SWcDfPVTizQ

Lenny Rachitsky: A lot of companies are trying to measure productivity for their teams.

Nicole Forsgren: Most productivity metrics are a lie. If the goal is more lines of code, I can prompt something to write the longest piece of code ever. It's just too easy to game that system.

Lenny Rachitsky: How do I know if my eng team is moving fast enough, if they can move faster, if they're just not performing as well as they can?

Nicole Forsgren: Most teams can move faster. But faster for what? We can ship trash faster every single day. We need strategy and really smart decisions to know what to ship.

Lenny Rachitsky: One of the biggest issues we're going to probably have with AI is learning how much to trust code that it generates.

Nicole Forsgren: We can't just put in a command, get something back, and accept it. We really need to evaluate it. Are we seeing hallucinations? What's the reliability? Does it meet the style that we would typically write?

Lenny Rachitsky: So much of the time is now going to be spent reviewing code versus writing code.

Nicole Forsgren: There's some real opportunity there to not just rethink workflows, but rethink how we structure our days and how we structure our work. Now, we can also make a 45-minute work block useful, because getting into the flow is actually kind of handed off, at least in part, to the machine, or the machine can help us get back into the flow by reminding us of context and generating diagrams of the system.

Lenny Rachitsky: What's just one thing that you think an eng team, a product team can do this week, next week to get more done?

Nicole Forsgren: Honestly, I think the best thing you can do-

Lenny Rachitsky: Today, my guest is Nicole Forsgren. With so much talk about how AI is increasing developer productivity, more and more people are asking, "How do we measure this productivity gain? And are these AI tools actually helping us or hurting how our developers work?" Nicole has been at the forefront of this space longer than anyone. She created the most-used frameworks for measuring developer experience, called DORA and SPACE. She wrote the most important book in the space, called Accelerate, and is about to publish her newest book, called Frictionless, which gives you a guide to helping your team move faster and do more in this emerging AI world. Her core thesis is that AI indeed accelerates coding. But developers aren't speeding up as much as you think because they still have to deal with broken builds and unreliable tools and processes, and a bunch of new bottlenecks that are emerging.

See what over 200,000 entrepreneurs love about Mercury. Visit mercury.com to apply online in 10 minutes. Mercury is a FinTech, not a bank. Banking services are provided through Mercury's FDIC-insured partner banks. For more details, check out the show notes. Here's a puzzle for you. What do OpenAI, Cursor, Perplexity, Vercel, FLAN, and hundreds of other winning companies have in common? The answer is they're all powered by today's sponsor, WorkOS. If you're building software for enterprises, you've probably felt the pain of integrating single sign-on, SCIM, RBAC, audit logs, and other features required by big customers. WorkOS turns those deal blockers into drop-in APIs with a modern developer platform built specifically for B2B SaaS.
Whether you're a seed-stage startup trying to land your first enterprise customer or a unicorn expanding globally, WorkOS is the fastest path to becoming enterprise-ready and unlocking growth. They're essentially Stripe for enterprise features. Visit workos.com to get started, or just hit up their Slack support, where they have real engineers who answer your questions super fast. WorkOS allows you to build like the best with delightful APIs, comprehensive docs, and a smooth developer experience. Go to workos.com to make your app enterprise-ready today.
Nicole, thank you so much for being here and welcome to the podcast.

Nicole Forsgren: Thank you. It's so good to be here.

Lenny Rachitsky: It's so good to have you back. I was just watching our first episode, which we did two and a half years ago. I was watching it, and I was both shocked and not shocked that we barely talked about AI. The episode was called How to Measure and Improve Developer Productivity, and we got to AI barely like an hour in and we're just like, "Hmm, I wonder what's going to happen with AI and productivity." Does that just blow your mind?

Nicole Forsgren: Yeah. Because it was just hitting the scene, it was the topic of so much conversation, and at the same time, so many things don't change. So many things are still important, so many things are the same. Yeah. It's also a little wild that it's been two and a half years. Where does time go? Time is a social construct?

Lenny Rachitsky: Yeah. Most of our conversation was just questions like, "Well, how might this impact people? How will we change the way we build product?" It was barely a thing back then. Now, it's the only thing that I imagine people want to talk about when they talk about engineering productivity. That's where we're going to be spending a lot of our time focusing on today. The reason I'm excited about this conversation, it feels like there's been so much money poured into AI tools increasing productivity. The fastest growing companies in the world are these engineering AI tools. And now, more and more people are just asking this question of just, "What gains are we getting out of this? How much is this actually helping us be more productive? How do we become more productive?"

You've been at the center of this world for longer than anyone. You've invented so many of the frameworks that people rely on now. So I'm really excited to have you back to talk about this stuff. I want to start with just this term DevEx, it's something that comes up a lot in this whole space, and we're going to hear this term a bunch in this conversation. Can you just explain what is DevEx, this term DevEx?

Nicole Forsgren: DevEx is developer experience. And when we think about developer experience, we're really talking about what it's like to build software, day to day, for a developer. So the friction that they face, the workflows that they have to go through, any support that they have. It's important because when DevEx is poor, everything else just isn't going to help. The best processes, the best tools, the best... whatever magic you have, if the DevEx is bad, everything kind of takes-

Lenny Rachitsky: Within DevEx is productivity, and I think the key insight that you and other folks in the space had is that it's not just productivity; there's also engineering happiness. We're going to get into a lot of these parts, but maybe just speak to that: there's productivity, and there are broader components to engineers being successful at a company.

Nicole Forsgren: Yeah. I love that point because productivity, first of all, is hard to define anyway. But if you're just looking at output, you can get there in a lot of different ways. But if you're getting there in ways that are high toil or high friction, then at some point, a developer is going to burn out. Or if it's super high cognitive load, if it's hard to even think about what you're doing because you're concentrating on the mechanics of... the plumbing of something, then you don't have the brain space left to come up with really innovative solutions and questions. So I love that it's kind of this self-reinforcing loop in terms of, "You do more work, you do better work." And it's better for people, it's better for the systems, it's better for our customers.

Lenny Rachitsky: I was going to get to this later, but I want to actually get to this right now, this idea of flow state for engineers. I was an engineer, actually, early in my career. I went to a school for computer science. I was an engineer for 10 years. The best part of the job for me was just this flow state you enter when you're coding and building, and things just feel so fun. It feels like AI is making that harder in a lot of ways because there's all these agents you're working with now, there's all this code that's kind of being written for you. Talk about the importance of flow state to developer happiness and developer productivity. How have you seen AI impacting that?

Nicole Forsgren: Well, there are lots of different ways to talk about DevEx. One way to talk about it is kind of three key things that have components that are important in themselves, and they also kind of reinforce each other. Flow state is one of them, cognitive load is another, and then feedback loops are another. I think when you touch on this... Your question about flow state is a really good one, and I'll admit we're just a few years into this. We're still figuring out what the best flow state and cognitive requirements are for people in this because, to your point, sometimes we're getting interrupted all the time. You don't just get in the flow and lock down, and write a whole bunch of code and do the typing of a whole bunch of code as much anymore. Instead, you're kind of creating a prompt, getting some code back and reviewing the code, trying to integrate what's happening in the system, and that can really interrupt.

At the same time though, it can contribute to flow if... I've seen some senior engineers pull together some tool chains that are really incredible, where they figured out how to keep the flow going. The fast feedback loops really, really work well for them. They can kind of assign out different pieces to agents. It helps them keep in the flow in terms of... Instead of details and line-by-line writing, they're in the flow in terms of, "What's my goal? What are the pieces that I need to get there? How quickly can I get there? So then, I can step back and kind of evaluate everything, and then dive back in and fix some pieces."

Lenny Rachitsky: Is there anything more you could say about this engineer that figured out this really cool workflow, about just what that looks like?

Nicole Forsgren: I've spoken with a handful of them, and I've kind of watched them work. I haven't built it myself yet. It's on my list. They've been able to set up this really incredible workspace and workflow where... Right now, a lot of us play around with tools and... We'll put in a prompt and we'll get a few lines back, or maybe we'll put in a prompt and we'll get whole programs back. Well, what they can do is... Many times I'll see them say, to help prime it, "This is what I want to build. It needs to have these basic architectural components. It needs to have this kind of a stack. It needs to follow this general workflow. Help me think that through," and it'll kind of design it for them. And then for each piece, it'll assign an agent to go work on each piece in parallel, and it'll say upfront, "These need to be able to work together, make sure it's architected correctly. Make sure we use appropriate APIs and conventions."

Then at the end, they can let it run for a few minutes. They can think through something else that's interesting or that they anticipate is going to be hairy, and they come back to something that's probably a little better than vibe coded. Because they were so systematic about it upfront, they're much closer to something that looks like production code.

Lenny Rachitsky: So what I'm hearing is: spend a little time upfront planning, which is what all these engineers working with AI are doing, versus just powering through and figuring it out as you go.

Nicole Forsgren: Yeah.

Lenny Rachitsky: Okay, cool. Let me get to a core question that I think is on a lot of people's minds. A lot of companies are trying to measure productivity for their teams: "Is this improving our productivity? Is this hurting our productivity?" So let me just start with this question: how are people currently doing this wrong when they try to measure their productivity gains with AI?

Nicole Forsgren: I'll say most productivity metrics are a lie. It's really tricky because, historically... Now, look, lines of code has always been a bad metric, but many folks still use lines of code-

Nicole Forsgren: ...yeah, as some proxy for output or productivity or complexity or something. Well, now, for many of the systems that use lines of code (which folks would sometimes whisper about and not openly discuss), it's just blown out of the water because, "What do you mean by lines of code?" If the goal is more lines of code, I can prompt something to write the longest piece of code ever and add tons of comments. We know that agents and LLMs tend to be very verbose by definition, and so it's just too easy to game that system and then introduce complexity and technical debt into all of the work that you're doing. I will say there are some things that we can kind of watch and pay attention to because... So lines of code as a productivity metric isn't great, it's pretty bad. But now, it's kind of more relevant if we can tease out which code came from people and which code came from AI, because now we can answer downstream questions.

"What is the code survivability rate? What is the quality of our code? Is our code being fed back into training systems? And for that code that's retraining systems later, especially if we're doing fine-tuning and local tuning, how much of that is machine generated? What types of loops is that creating, and what types of patterns or biases might it be inadvertently introducing?" On the one hand, it's not good as a productivity metric, but it can be useful. I'll even say the same for DORA: the DORA metrics, the speed metrics, the stability metrics. If that's all you're looking at, it's not going to be sufficient anymore, because AI has now changed the way we think about feedback loops. They need to be much faster. Now, what DORA's meant for, kind of assessing the pipeline overall in terms of speed and stability, still works. But we can't just blindly apply the existing metrics we've used before, because we'll miss super important phenomena and changes in the way people work.
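Teasing AI-generated code apart from human-written code and then asking a downstream question like survivability could look something like this minimal Python sketch. The commit records, field names, and numbers are hypothetical illustrations, not output from any real tool:

```python
# Hypothetical commit records: lines added, lines still present at a later
# audit date, and whether the code was AI-generated or human-written.
commits = [
    {"author_type": "ai",    "lines_added": 120, "lines_surviving": 60},
    {"author_type": "ai",    "lines_added": 80,  "lines_surviving": 64},
    {"author_type": "human", "lines_added": 50,  "lines_surviving": 45},
    {"author_type": "human", "lines_added": 30,  "lines_surviving": 24},
]

def survivability(commits, author_type):
    """Fraction of added lines from one author type that survive to the audit date."""
    added = sum(c["lines_added"] for c in commits if c["author_type"] == author_type)
    surviving = sum(c["lines_surviving"] for c in commits if c["author_type"] == author_type)
    return surviving / added if added else 0.0

print(f"AI survivability:    {survivability(commits, 'ai'):.0%}")     # 62%
print(f"Human survivability: {survivability(commits, 'human'):.0%}")  # 86%
```

Survivability here is simply the share of added lines still present at a later audit; a real system would derive those counts from version-control history over time.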

Lenny Rachitsky: Interesting. You invented DORA, which was the main framework people used for a long time to measure productivity. And then there's SPACE, there's Core 4, there's probably others. So what I'm hearing here is that all of these are kind of out of date now that AI is contributing large portions of code.

Nicole Forsgren: I will say if it is a prescriptive metric, it needs to be used only in the way it was prescribed.

Lenny Rachitsky: So-

Nicole Forsgren: For DORA, there are four key metrics. There are two speed metrics, deployment frequency and lead time, so code commit to code deploy. There are two stability metrics, MTTR and change fail rate. If those are used to assess the speed of the pipeline and the general performance of the pipeline, that's great. If you're trying to use those to understand... Because implied in that is feedback loops, right? Because you used to kind of get feedback from customers. But we can't just use that blindly now when we're using AI, as an example, because we have feedback loops much earlier, and not even just at the local build and test phase. We have feedback loops throughout, and even sometimes in the middle of the pipeline, that we really want to leverage in ways that weren't as useful before. I won't say they weren't possible, but we just didn't really focus there.
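The four key metrics she names (deployment frequency, lead time, change fail rate, MTTR) can all be computed from a simple deployment log. A rough Python sketch; the log format and the numbers are made up for illustration, not a standard schema:

```python
from datetime import datetime
from statistics import mean

# Hypothetical deployment log: commit time, deploy time, whether the change
# failed in production, and hours to restore service if it did.
deploys = [
    {"committed": datetime(2025, 1, 6, 9),   "deployed": datetime(2025, 1, 6, 15),  "failed": False, "restore_hours": 0},
    {"committed": datetime(2025, 1, 7, 10),  "deployed": datetime(2025, 1, 8, 11),  "failed": True,  "restore_hours": 2},
    {"committed": datetime(2025, 1, 9, 8),   "deployed": datetime(2025, 1, 9, 12),  "failed": False, "restore_hours": 0},
    {"committed": datetime(2025, 1, 10, 14), "deployed": datetime(2025, 1, 10, 18), "failed": False, "restore_hours": 0},
]

days_observed = 5  # one working week

# Speed: how often we deploy, and average commit-to-deploy lead time.
deployment_frequency = len(deploys) / days_observed
lead_time_hours = mean((d["deployed"] - d["committed"]).total_seconds() / 3600 for d in deploys)

# Stability: how often changes fail, and how quickly we recover (MTTR).
failures = [d for d in deploys if d["failed"]]
change_fail_rate = len(failures) / len(deploys)
mttr_hours = mean(d["restore_hours"] for d in failures) if failures else 0.0

print(f"Deployment frequency: {deployment_frequency:.1f}/day")  # 0.8/day
print(f"Lead time:            {lead_time_hours:.1f} h")         # 9.8 h
print(f"Change fail rate:     {change_fail_rate:.0%}")          # 25%
print(f"MTTR:                 {mttr_hours:.1f} h")              # 2.0 h
```

As she says, these assess the pipeline overall; they say nothing by themselves about the earlier, faster feedback loops that AI introduces.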

So those are prescriptive metrics. When we think about SPACE, SPACE is a framework. It doesn't tell you what metric to use. So I'll say, sometimes people get real frustrated because I didn't tell them what to measure. But now, I think that's the power of it. We're actually seeing that SPACE applies fairly well in these new emerging contexts like AI, because we still want to look at... SPACE is an acronym. We still want to look at satisfaction. We still want to look at performance: what's the outcome? We still want to look at activity. Yes, in some ways, lines of code and number of PRs can be useful for something, or number of alerts or number of things, activities or counts. C is communication and collaboration. This is also super important and useful, because it's how our systems communicate with each other, and also how our people do. "What proportion of work is being offloaded to a chatbot versus talking to a senior engineer on the team?" More isn't always better and less isn't always better; it depends.
And then efficiency and flow: "Can people get in the flow? How much time does it take to do things? What is the flow like through our system?" Here, I would probably add a couple of dimensions, chatting with some of the early authors. Say, trust. Not to say trust wasn't important before, but now it's very, very front of mind. Right? Before, you'd build your code, and if the compile comes back, you're fine. And that's the way it is. LLMs are non-deterministic. Right now, we can't just put in a command, get something back, and accept it. We really need to evaluate it. So, "Are we seeing hallucinations? What's the reliability? Does it meet the style that we would typically write? And if it doesn't, is that fine?" So it depends on... Prescriptive. You've got to make sure you're using it fit for purpose. Right?
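A SPACE-style scorecard, including the extra trust dimension discussed here, is just a set of named dimensions, each holding whatever metrics fit your context. A hypothetical Python sketch; the specific metrics and numbers are illustrative assumptions, not anything the framework prescribes:

```python
# SPACE names the dimensions, not the metrics; each team picks metrics that
# fit its context. "trust" is the extra dimension proposed for AI-generated code.
scorecard = {
    "satisfaction":  {"dev_survey_score": 4.1},           # out of 5
    "performance":   {"feature_adoption_rate": 0.37},
    "activity":      {"prs_per_week": 22},
    "communication": {"chatbot_vs_senior_ratio": 0.6},
    "efficiency":    {"median_flow_block_minutes": 45},
    "trust":         {"ai_suggestion_accept_rate": 0.48,
                      "hallucination_reports_per_week": 3},
}

def report(scorecard):
    """Flatten the scorecard into (dimension, metric, value) rows for review."""
    return [(dim, name, value)
            for dim, metrics in scorecard.items()
            for name, value in metrics.items()]

for dim, name, value in report(scorecard):
    print(f"{dim:>13}  {name:<32} {value}")
```

The point of the structure is the one Nicole makes: no single number is the answer, and more (or less) of any one metric isn't automatically better.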

Lenny Rachitsky: We're going to get to your current thinking on the best way to do this stuff. You have a book coming out that explains how to do this well, so we're going to get to that. One thing I wanted to highlight in our last chat that we had, you highlighted that one of the biggest issues we're going to probably have with AI is trust, understanding and learning how much to trust the code that it generates, and also how much... you said this, two and a half years ago, that so much of the time is now going to be spent reviewing code versus writing code. That's exactly what I'm hearing.

Nicole Forsgren: I think it'll be interesting to see how that impacts the way we structure work moving forward. We were talking about flow state and cognitive load. Now that our attention has to focus on things at certain times and it's broken up from how we used to do it, I think there's some real opportunity there to not just rethink workflows, but rethink how we structure our days and how we structure our work.

Lenny Rachitsky: Can you say more about that? Just what is that? What are you thinking will be happening? Where do you think things go? What are you seeing working?

Nicole Forsgren: This is purely speculative. But for example, Gloria Mark has done some really good work on attention and deep work, and humans can get about four hours of good deep work a day. That's about it.

Lenny Rachitsky: Yeah. I feel that.

Nicole Forsgren: That's kind of the upper limit-ish for the most part, and I'm sure people are going to be like, "Well, I am superhuman and I can do-

Lenny Rachitsky: What if you take 20 grams of creatine?

Nicole Forsgren: Right. What if we microdose?

Lenny Rachitsky: Yeah, exactly.

Nicole Forsgren: Yeah. So in the context of knowing we have about four hours of good deep work... I'm sure many of us have probably hit this, right? We have good periods. Maybe it's morning, maybe it's afternoon for folks. And then you hit a time where you're like, "I'm going to clean up my inbox because that is all I can do right now. I can be functional, but I'm not going to come up with my best innovative, problem solving, authoring, code writing work." A lot of times, the way to do that and to get into it is to have these long chunks to get into flow and to get that deep work. Usually, I'm two hours-ish. An hour can be tricky because it could take time to get into that state. Okay. Well, when we think about what it used to be like, back in the old days, three years ago, three and a half years ago, we could block off four hours of time and we could probably get two or three hours of really good work done. Because we were just focused, right? There were no interruptions, minimal interruptions.

Now, the nature of writing code and systems itself is interrupt driven, or full of interruptions at least, because you start something and then it interjects. So how do we think about that? Does that mean that a four-hour work block is still useful? Probably. But does that mean that now we can also make a 45-minute work block useful? Because getting into the flow is actually kind of handed off, at least in part, to the machine, or the machine can help us get back into the flow by reminding us of context and generating diagrams of the system and all the things. So I think that's a really, really interesting area that's just ripe for questions and opportunity. And please, folks, do this research and come back to me because... It might not make my list, but it's such a great question.

Lenny Rachitsky: That is so interesting. Essentially, every engineer is turning into an EM, an engineering manager, coordinating all of these junior AI engineers. So your point is, even if you only have a 30-minute block, you can't get deep into code, but you can unblock all these AI engineers that are running off doing tasks. Plus, your point is they remind you of just like, "Here's where you left off. Okay. You can just jump into this code, maybe make some tweaks."

Nicole Forsgren: Yeah.

Lenny Rachitsky: So interesting. Let me zoom out a little bit. Before we get into your framework for how to approach developer experience, the latest thinking you've got: beyond the obvious point that engineers doing more is great, what's your best pitch for why companies should really, really focus on developer experience?

Nicole Forsgren: I hate to say return on investment, but the business value is... the opportunity here is huge. In general, we write software for fun and for hobbies, but we also have software because it meets a business need. It helps us with market share, it helps us attract and retain customers, it helps us do all of these things. And I think DevEx is important because it enables all of that software creation, it enables all of that problem solving. It enables the super rapid experimentation with customers that... Before, you'd need a while for a prototype and maybe a little bit longer to actually flight it through an A/B test on a production system. You can do it in hours, right now.

Lenny Rachitsky: Maybe the opposite end of the spectrum, getting very tactical, before we get into the larger framework: what's just one thing that you think an eng team, a product team can do this week, next week to help their developer experience, maybe get more done?

Nicole Forsgren: Honestly, I think the best thing you can do is go talk to people and listen. I love that the audience of this podcast is primarily PMs because they tend to be really good at this. And I would say start with listening and not with tools and automation. So many times companies are like, "Well, I'm just going to build this tool," or, "I'm going to build this thing." Often you build a thing that you yourself have had a challenge with or that is easy to do, easy to automate. And if you just go talk to people and ask the developers like, "Think of yesterday, what did you do yesterday? Walk me through it. What were the points that were just delightful? What were the points that were really difficult? Where did you get frustrated? Where did you get slowed down? Where was there friction?" If you go talk to a handful of people, a lot of times, you can surface a handful of things that are relatively low lift and still have impact, or you can identify a process that's unnecessarily complex and slow.

Lenny Rachitsky: So the listening, I hear: you want to help your teams move faster and be happier eng teams, and your advice is just, "Before you do anything, go ask them what is bothering you."

Nicole Forsgren: Go ask them, yeah. And trust me, most developers are going to be more than happy to tell you what's broken and what's bad. I'll say, there was one company that I had worked with. I remember they had a process that was really difficult, and it was on an old mainframe system, and they were going to have to replatform the whole thing, and so they never went to work on it or talked about it. Everyone hated it because it was this huge delay. I mean, all they had to do was change a process. Sometimes all you have to do is change a process. And they changed it so that instead of... I think someone had to print it out and walk it down three or four flights to get approval. And then someone else had to walk it back up, and so it was just that interim. They didn't replatform anything. They didn't redesign anything major. They just sent an email.

Lenny Rachitsky: Let me push on that and... I'm curious just what are the most common things people do. If you're just starting on, "Okay, we need to focus on engineering experience," what do you find are the most... two or three most common improvements companies need to make?

Nicole Forsgren: I'll say, I'll kind of echo that process point: there's almost always a process that can be improved, and that can be improved without a lot of engineering lift or a lot of engineering headcount. Most large companies, in particular, have something that takes several, several steps. It's the way it is because it's the way it is, but that doesn't have to be the way it is. And even small companies are sometimes just a little too YOLO: you don't know what the process is, and you're kind of chasing everyone around. So if you can create a very lightweight process, that can also be helpful. That can be one of the best places to start, especially if you have limited exposure to the whole rest of the org. Sometimes just a team process can help.

I will say, from a business leader's standpoint, a lot of what you can do is provide structure and support for this organizational change. Communicate what you're doing, communicate what the priorities are, communicate why this is important, and celebrate wins. Because if folks try to do this as just a one-off, fully isolated side project, it's really challenging to get some good momentum, to get people to care, and to get them to stay involved. Because it feels like it's just another internal project that isn't going to matter or that isn't going to get celebrated, but it has these huge potential upside returns for the business.

Lenny Rachitsky: It's interesting, what I'm hearing here is nothing about tools or technologies. It's not like move to this cloud, it's not like install this new deployment system; it's processes and people and org and morale.

Nicole Forsgren: Yeah. Now, there will be technical pieces that are very important, especially now with AI, where we're rethinking how build and test systems work. We're rethinking feedback to users so that it's very, very customized in terms of what is shared and when it is shared. There are a lot of technical pieces that are involved, but that's not the only thing. It's necessary but not sufficient, and that doesn't have to be the place that you start.

Lenny Rachitsky: I have a hard question I want to ask you that I thought of as you were talking. I feel like this is the question that most founders and heads of engineering think about. And the question is just: how do I know if my eng team is moving fast enough, if they can move faster, if they're just not performing as well as they can? What are maybe smells, signs that tell you, "Yeah, my team should be moving faster," versus, "This is just the way it works. This is as fast as they can move"?

Nicole Forsgren: Most teams can move faster, right? Also, given what we know about cognitive load, not all speed gains are necessarily good. Or the upside is going to be kind of limited once you hit a certain point, and most people are not even near that point. I don't know a single team, frankly. But how do you know? You know if you're always hearing about builds breaking, flaky tests, overly long processes if you have to request a new system or provision a new environment, or if it's really, really hard to switch tasks or switch projects. So if someone has an opportunity to go work in another part of an org and they don't, for reasons that are unclear and not political, or anytime anyone says anything about the system, that's usually a pretty good smell that there's friction somewhere.

Because once you finally figure out your system and you're able to get work done, the switching costs can often be really, really high to go anywhere else. So sometimes people will do that. But I've worked with companies where switching orgs within the company, you had to basically pay the same tax as a new hire because the systems were so different and they were so full of friction, and it was so difficult to do so many things.

Lenny Rachitsky: I love the first part of your answer especially, which is you can always move faster. I think every founder is going to love hearing that. To your point though, there's diminishing returns over time?

Nicole Forsgren: Yeah. And you don't know about the quality, right? So I think the other side is that you can always move faster, but faster for what? Are we making the right business decisions? And I think that's especially where PMs come in. We can ship trash faster every single day. We need strategy and really smart decisions to know what to ship, what to experiment with, what features we want to do in what order and what rollout. The strategy is the core piece, and then think about speeding that up. If we don't have the other pieces in place, I mean, garbage in, garbage out.

Lenny Rachitsky: I want to follow that thread, but before I do that, just to mirror back what you shared. So signs that there's a lot of low-hanging fruit to improve the productivity of your team: builds are always breaking; there are flaky tests that are constantly incorrect, false positives; it's hard to context switch between different projects; you just hear people talking about how the system is really hard to work with. Is that roughly right?

Nicole Forsgren: Yeah.

Lenny Rachitsky: Cool, okay. So going back to the point you just made, there's a sense that AI is making teams so much faster because it's writing all this code for them. You're going to have all these asynchronous agents, engineers working for you. It feels like a core part of your message is that that's just one part of engineering work, and there's so much more, including figuring out what to build and alignment internally. Maybe just speak to that: there's a lot of opportunity to improve engineering performance and productivity, but there are so many other elements that are not improved through AI?

Nicole Forsgren: Yes. Or could be in the future, right?

Lenny Rachitsky: Mm-hmm.

Nicole Forsgren: I think there are a lot of ways that we can pull in AI tools to help us refine our strategy, refine our message, think about the experimentation methods or targets of experimentation, or think about our total addressable market, but we need to have that strategy and plan fairly well aligned, or at least have two or three alternatives that you want to test. Because now, the engineering can go, or at least the prototyping especially, much, much faster. We can throw out prototypes. We can run any tests and experiments that are customer facing, assuming that we have the infrastructure in place, which allows us to learn and progress much faster than before. In some places, it used to take months to get something through production to do A/B testing and get feedback. We can do this in a day or two, definitely under a week. But we want to make sure that we're building and testing the right things: "Are we partnering with the right... Do we have the data that we need?"

And I will say AI can actually be a pretty good partner there if you have a good conversation with it, and then also check with your experts: "What type of data should I be looking at? What type of instrumentation do I need? What type of analysis can I do?" Because then, you can also go to your data science team and say, "I'm planning on doing this. I'd like to..." Let's not just YOLO A/B tests, because it's a shame to do a large test and end up disrupting users or customers, or breaking privacy or security protocols, and also end up with data that's unusable because you just can't get the signal that you're looking for. But now, I'm also seeing people accelerate that into a few days versus a few weeks. So they can start those key stakeholder discussions from a much more informed, filled-out place.
Lenny RachitskyLike I mentioned earlier, I use Coda every single day. And more than 50,000 teams trust Coda to keep them more aligned and focused. If you're a startup team looking to increase alignment and agility, Coda can help you move from planning to execution in record time. To try it for yourself, go to coda.io/lenny today and get six months free of the team plan for startups. That's C-O-D-A-dot-I-O-slash-Lenny to get started for free and get six months of the team plan, coda.io/lenny.
I love that you work with a bunch of different companies and a bunch of different types of businesses. I think very few people get to see inside a lot of different places. What kind of gains are you just seeing in terms of increased productivity with AI? How big of a gain have you seen?

Nicole ForsgrenI'd say it's real, and I would also say we don't have great measures for it yet. We're still trying to figure out what to measure and what that looks like. One of the best is going to be velocity, all the way through the system: how quickly can you get a feature or a product or something through the system so that you can then experiment and test, either from idea to final end, or even a feature or piece through the system so we can test. That's really good. Now, that's also hard to tie back directly to a particular AI tool in the hands of a particular developer. But there are some other things that we can look at, and one that I've seen is, again, this kind of rapid prototyping.

I hate lines of code, but I'm going to use lines of code. We do see... I worked with some folks who had a whole set of companies they were looking at, and they found that AI was generating significantly more code for the people who were using it regularly. But then, they also found that for folks who were regular users of AI coding environments, AI IDEs, the tool gave them more code, and then for the engineers themselves, the increase was double what the coding agent had given them. So one, I'd say, probably it's kind of a secondary or knock-on effect, or just a smell: it can unblock you. It can speed up the work that you would already do. I know sometimes when I work, the first few minutes, it's hard for me to start. But once I get started, I'm there. So they're really good at unblocking and unlocking that.

Lenny RachitskySomething I've seen people on Twitter sharing is how good OpenAI Codex, especially, is at finding really gnarly bugs. And I think it was Karpathy that shared it. He was so stuck on a bug, and no AI tool could figure it out. And then the latest version of Codex spent an hour or something looking into it, and found it for him.

Nicole ForsgrenYeah. I'm hearing incredible things like that, right? Well, and even also writing unit tests and spinning up unit tests, and creating documentation and cleaning up documentation because I know now people are like, "Oh. Well, we have agents. I don't need to read the docs because there's the code there." It turns out, agents rely on good data because it's all about how they've been trained or how they've been grounded. And better data gives you better outcomes, and some of that data includes documentation and comments. The better documentation and the better comments you have, the better performance you're going to get out of your AI tools.

Lenny RachitskyAnd AI can help you write that documentation. I've been working with Devin a little bit, and it's really good at that stuff.

Nicole ForsgrenYeah.

Lenny RachitskyOkay. Let's talk about this framework, this book. So you're publishing a book called Frictionless, which sounds like a dream, "How do you create a dev team that's frictionless?" It's called Frictionless: 7 Steps to Remove Barriers, Unlock Value, and Outpace Your Competition in the Age of AI. There's a seven-step process to this. Walk us through this and maybe give us just context on this book, who it's meant for, what problem it solves, and then the seven steps.

Nicole ForsgrenI will say, I also wrote this with Abi Noda of DX. He has incredible experience in the space. He's worked with hundreds of companies, and so it was kind of nice bouncing ideas off of him. Also, thanks to all of the engineering leads and DevEx leads, and CTOs, and engineers that we talked to to make sure that our smells were right. So who is this book for-

Lenny RachitskyLet me take a tangent on Abi and DX, since you mentioned him. This is super interesting, and I think it connects so directly with this conversation. Abi started this company called DX, which is such a great name for a company around developer experience. They just sold the company for a billion dollars to Atlassian. It's a very high multiple on their ARR. It, to me, shows exactly why this conversation is so valuable, just how much value companies are putting into improving developer experience. Atlassian would spend a billion dollars on this. It's an early-stage-ish startup. It was doing really well and people loved it, but it was early-stage-ish, and a billion dollars. And the idea is they have all these companies using Jira and all their products, and they're all trying to figure out, "How do we measure productivity?" It's worth a lot of money to them. And I know you were an early advisor to them too, so-

Nicole ForsgrenYeah.

Lenny Rachitsky... it just shows us how important this is.

Nicole ForsgrenYeah. Well, I think it also shows us how much value you can get out of this. There's so much low-hanging fruit, there's so much unlocked potential, and it's hard to know where to start a lot of times, even in... I've been at large companies that have a lot of expertise and a lot of really, really smart people. But if you haven't been in this space and thinking about it this way, it's hard to know where to start, or it's easy to make simple mistakes up front that mean you kind of need to start over later. So I guess it also brings us back to, "Who is this book for?" It's for anyone that cares about DevEx: definitely technology leaders, anyone who's trying to kick off a DevEx program or is working on a DevEx improvement program. I think it's particularly relevant for PMs, because if you're PMing something that involves building and creating software, improving DevEx will only help your team. And also, you have key skills and insights and instincts that are so important to DevEx that many times, I will say, I've seen engineering teams just miss.

Lenny RachitskyOkay. What's the framework? What are the steps? Where do people start?

Nicole ForsgrenThe book goes through a seven-step process, and then also provides some key principles at the end. Step one is to start the journey. So assuming you're kicking off, you can start the journey. And this involves what we have already talked about: go talk to people, have a listening tour, synthesize what you learn, visualize the workflow and tools, get a handle on what the current state is. Step two is to get a quick win. So start small, get a quick win, pick the right projects, share out what you've done. Step three is using data to optimize the work. So establish some of your data foundation, find the data that's there, start collecting new data, use some surveys for some really fast insights, and we include example surveys. Step four then is to decide strategy and priority. Once you have some data, then you need to know, of all the things that are potentially broken, and if you've already gotten your quick win, of all the things that are left, "What should I do next?" So we walk through some evaluation frameworks there.

Step five is to sell your strategy. Once you've decided, now you have to kind of convince everyone else. So now you want to get feedback, you want to share why this is the right strategy right now. Step six is to drive change at your scale. So here, we address folks that have local scope of control. If you're starting on just a dev team, you want to do it yourself, kind of grassroots effort or global scope of control. If you're the VP of developer experience or something, there are some things that you can leverage for a top down, and then how do you drive change when you're kind of somewhere in the middle, because you can leverage both types of strategies. And then step seven is to evaluate your progress and show value, and then kind of loop back around.
I will say that we wrote this so that you could jump into any step wherever you are right now. If you're kicking off a team or an initiative, you'll probably want to start at step one. You should definitely start at step one. If you're joining an existing initiative, you could jump into picking the priority or implementing the changes. So those are the seven steps. Beyond the seven steps, there are a few practices that we also recommend: thinking about resourcing it, change management, making technology sustainable, and then also bringing a PM lens to this. "How can we think about developer experience as a product, and how do we think about the metrics that we have as a product?"

Lenny RachitskyAwesome, okay. I have questions. Point people to the book real quick. What's the URL? How do they get it? When does it come out?

Nicole ForsgrenYeah, developerexperiencebook.com. Right now, you can sign up for the mailing list. We'll let you know when it's out on pre-order, and we'll also be sharing pieces of the workbook. So we've got almost a hundred page workbook that goes along with the book, and then it should be out by end of year.

Lenny RachitskyOkay. So one piece of this is just this term developer experience feels very intentional in that it's not developer productivity, developer work. It's how do we make developer experiences better at our company, which includes they get more done, but also they're happier and things like that. So I think that's an important element of this, right?

Nicole ForsgrenYeah, absolutely.

Lenny RachitskyOkay.

Nicole ForsgrenBecause, again, it's not just about productivity. We talked about this from the frame and the lens of, "We need to be building the right thing." And you want to be productive, but you also want to be thinking about... and this is what engineers are also just really incredibly good at, give them a problem and don't tell them how to solve it, and then they can solve it better. They have the freedom, they have the innovation, they have the creativity so that they can solve this problem. If it's only about productivity, then it's just lines of code or number PRs or whatever. But we really want to talk about value and how do we unlock value, and how do we get value faster. And that involves, yes, making them more productive and removing friction because then, they have the flow and the cognitive load and the things that we kind of talked about.

Lenny RachitskyAwesome, okay. And then say someone wants to start this team, what does it usually look like. At Airbnb, I remember this team forming. It was just like an engineer or two, getting it started and taking charge. What do you recommend as the pilot team, and then what does it look like as it grows?

Nicole ForsgrenThere are a few ways to do this, right? So if you're doing it yourself, you could do it with a couple of engineers, maybe a PM or a PgM or a TPM to help communicate, because really, comms plans are just so important here. On a small scale, what we want to do is look for those quick wins, look for things that you can do at small scale. Some folks call them things like paper cuts. These are small things that you can do to help people see the value and feel the benefit themselves: "How can a developer's work get better? How can their day-to-day work get better?" Kind of build momentum from there. If you're working from a top-down structure and you have the remit, you still want some quick wins, but those quick wins can look a little more global in scale because you have the infrastructure or the backing to make different types of changes that aren't only local.

So an example of a small local change could be just cleaning up your tests, your test suites. Any team could do that. At a more global scale, it might be changing an organization-wide process that is just overly cumbersome, or throwing some resourcing into cleaning up the provisioning environment.

Lenny RachitskyOkay. What kind of impact have you seen from teams like this forming, on the engineering teams at their companies?

Nicole ForsgrenI'll say I've seen a huge impact: for smaller companies, hundreds of thousands of dollars; for large companies, in the billions. Well, also, we need to learn how to communicate that: "What does the math look like?" Many times, we can look at saving time, we can look at saving costs, we can look at a lot of different things. We can look at speed to value and speed to market. We can look at risk reduction. But the gains really are there. I will mention that it tends to follow something like the J-curve. So you'll have a couple of quick wins and it'll look like a big win, and then you'll hit a little divot where suddenly the really obvious projects, the low-hanging fruit, are handled. So now, we need to do a little bit of work. We might need to build out a little bit more infrastructure. We might need to build out a little more telemetry, so that we can capture the things we want to capture. And then once we get that done, we start to see those benefits really compound.

Lenny RachitskySo going back to that measurement number, what do you recommend? How do people find these numbers? Because I think that's so much of the power of this is like, "We saved a million dollars doing this." What do you look at to figure that out?

Nicole ForsgrenI think there are a few different things to keep in mind, like who is our key audience, and we usually have a few key audiences. We really want to be able to speak to developers because they're the ones that are going to be using the systems. They'll be partnering with you on either building them or at least providing feedback about what you're doing. So for them, we often want to frame this in terms of things they care about. So time savings: if something gets faster, they can save time. They don't spend time doing setup when they don't need to anymore. Related to that is reduced toil. Compliance and security are super important, but many times they require several manual steps that... I won't say they're not value add, but they're not value add from an individual human perspective. If we can automate as much as possible, that's great. And improved focus time.

That's from the developer's point of view. Leadership often cares about... They care about those things, but they often care more about other things. So we could talk about costs in dollars: "Can we accelerate revenue? What does our time to value look like? What is our velocity? How quickly can we get feedback from customers?" And for folks and organizations that are in really competitive environments, that can be really compelling because it's all about speed. We could talk about saving money. Here, we can look at quantifying savings. One example is test and build. If we can clean up a test and build suite: to a developer, they really want to hear about time saved and more reliable systems. There's less toil because they don't have to keep re-running tests or go clean up test suites.
From the business perspective, cleaning up a test and build suite can be cloud cost savings, because all of those tests are running somewhere on a cloud. And if they always fail, or if it's just a waste of spend, cleaning that up can be useful, recovering some capacity. We can always talk about time and productivity gains: "How much equivalent developer time are we losing on things that are not necessarily value add?" And then sometimes we can correlate to business outcomes, and correlate is usually the best we can do here, but there can be some pretty compelling correlations in terms of speeding up time to value and increased market share, for example.
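That two-audience framing, cloud spend plus recovered developer time from the same cleanup, can be sketched as a back-of-the-envelope calculation. All the rates and counts below are hypothetical placeholders, not figures from the conversation; plug in your own CI pricing and rerun data.

```python
# Back-of-the-envelope estimate of what flaky-test reruns cost per year.
# Every input here is a made-up placeholder for illustration.

def ci_savings(reruns_per_day: float, minutes_per_run: float,
               cost_per_ci_minute: float, dev_minutes_lost_per_rerun: float,
               dev_cost_per_minute: float, workdays: int = 250) -> dict:
    """Annualize cloud spend and developer time wasted on rerunning flaky tests."""
    cloud = reruns_per_day * minutes_per_run * cost_per_ci_minute * workdays
    dev_time = reruns_per_day * dev_minutes_lost_per_rerun * dev_cost_per_minute * workdays
    return {"cloud_spend": round(cloud, 2),
            "dev_time_cost": round(dev_time, 2),
            "total": round(cloud + dev_time, 2)}

# Hypothetical team: 40 reruns/day, 15-minute runs, $0.05 per CI minute,
# 10 developer-minutes lost per rerun at a loaded $1.50/minute.
estimate = ci_savings(reruns_per_day=40, minutes_per_run=15,
                      cost_per_ci_minute=0.05,
                      dev_minutes_lost_per_rerun=10,
                      dev_cost_per_minute=1.5)
print(estimate)  # e.g. {'cloud_spend': 7500.0, 'dev_time_cost': 150000.0, 'total': 157500.0}
```

The point is less the arithmetic than the translation: the same cleanup reads as "less toil" to a developer and as recovered cloud spend and capacity to leadership.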

Lenny RachitskyLet me follow that thread and come back to this, what I think is the biggest question people have right now with AI and productivity, and I don't think anyone has the answer yet, but I'm curious to get your take of just what should people do today? What's the best approach to understanding what impact AI tools are having on their productivity? Because they're spending all this money on there. I don't know, what are we getting out of this? So I guess things are moving faster, but I don't know. So if someone had to just like, "Okay, here's what I should probably try to do," what would be your best advice here for measuring the impact of AI tools on productivity?

Nicole ForsgrenI would say it depends. In part, it depends on what your leadership chain really cares about. We are usually pretty good at figuring out what matters to developers, and we can communicate that to them. But if we're trying to identify two or three data points to really focus on, because when we're first starting with data, sometimes it can be challenging: what do they care about? Think about the messaging you've been hearing. Have they been talking about market share, losing market share, or competitiveness in the marketplace? If that's it, focus on speed. Think about ways that you can capture metrics for speed from feature to production, or feature to customer, or feature to experiment, and what that feedback loop looks like.

Now, if they're talking about profit margin all the time... we always talk about money because this is business, but if that seems to be an overarching narrative, look for ways that you can save money and then translate that into recovered and recouped headcount cost. Or sometimes you'll reinvent or change a process and then you no longer need as many vendors, so reductions in vendor spend can also help there. I also say it depends because sometimes leadership will say something, and it kind of comes up as a theme. If you can solve a problem that they have, or it's something that they're focused on, you can even slightly reframe it: if they're calling everything developer productivity, go ahead and call it productivity. If they're calling it velocity, and velocity is what matters to them, think about how to frame this in terms of velocity. If they're talking about transformation or disruption, how does this help with the disruption? Because then, it will resonate with them. We don't want to make them work to understand what it is that we're doing and the value that we provide.

Lenny RachitskyThat is such good advice. Just to reflect back, the advice here is: if your company's trying to figure out what sort of impact AI tools are having, first, what does the company care about most? What do leaders care about most? Could be market share, could be profit margin, could be velocity ("we need higher velocity"), could be transformation. So your advice there is figure that out based on words and phrases you're hearing. Then figure out ways to measure that: ways to measure market share growing, profit margin increasing. I love these examples, like time from idea to production or to experiment, so maybe start tracking that. If it's margin, it's money saved by fewer tests failing or some vendor you don't have to pay for, things like that. And then velocity, I imagine that's where things like DORA come in, of just speed of engineering, shipping, or... What would you think about there for velocity?

Nicole ForsgrenI would say it's actually one of those... I would pick as broad a swathe as you can. So if you can go from idea to customer, or idea to experiment, how long does that take? How long did it typically take, and how long does it take now with improved use of AI tooling and a reduction in friction? That's where, and we talk about this a little bit in the book, how do we deal with attribution challenges? What was responsible for this? Was it the DevEx or was it AI? Go ahead and disclose that. Say, "Yes, we rolled out AI tools. We also had this effort in DevEx. They partnered very closely together." Both of them probably contributed to this, right? If we had AI tools without the DevEx improvements, we probably would've had some improvements, but not nearly as much.
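As a rough illustration of measuring "as broad a swathe as you can," here is a minimal sketch that computes idea-to-shipped lead time from two timestamps per work item. The field names and dates are made up for illustration; in practice these would come from your tracker or deployment pipeline, and you would compare the distribution before and after an AI or DevEx rollout.

```python
# Sketch: "idea to customer" lead time from two timestamps per work item.
# Data and field names are hypothetical placeholders.
from datetime import datetime
from statistics import median

items = [
    {"idea": "2025-01-06", "shipped": "2025-01-10"},
    {"idea": "2025-01-06", "shipped": "2025-01-27"},
    {"idea": "2025-02-03", "shipped": "2025-02-05"},
]

def lead_time_days(item: dict) -> int:
    """Whole days between the idea timestamp and the shipped timestamp."""
    fmt = "%Y-%m-%d"
    return (datetime.strptime(item["shipped"], fmt)
            - datetime.strptime(item["idea"], fmt)).days

times = sorted(lead_time_days(i) for i in items)
print("median days:", median(times), "| worst:", times[-1])
```

Reporting the median alongside the worst case keeps one outlier from hiding (or inflating) the improvement, which matters when you disclose that AI tooling and DevEx work both contributed.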

Lenny RachitskyIf people were starting to do this today, say they're just like, "I want to start measuring developer experience," are there a two or three metrics everybody basically needs they should just start measuring ASAP?

Nicole ForsgrenIf you're just starting today and you have nothing at all, talk to people, obviously. After that, I would do surveys, because surveys can give you a nice overall view of the landscape quickly, so that you know where the big challenges are. I say that because if you're just starting, you might not have instrumentation through your system or all the metrics. And if you do already, it might not be what you think you want. Metrics that were designed without purpose are questionable. Metrics that were designed for another purpose might work for what you want, but they might not, so we can't just assume we have them. That's one reason I like surveys, and we include an example in the book. You can just ask a few questions: "How satisfied are you? What are the biggest barriers to your productivity, or what are the biggest challenges to getting work done?" and let them pick either from a set of tools or maybe a set of processes, and then say... Let them pick three, just three.

Of those three, how often does this affect you? Is this hourly? Is this daily? Is this weekly? Is this quarterly? Because sometimes it hits you every single day, and you're just mad about it. Sometimes it only hits you once a quarter because it's end of quarter, but it's so onerous. And then kind of an open text field, like, "Is there anything else we should know?" That can give you incredible signal, because you're making folks prioritize the top three things. If you let them pick everything, it makes the data super messy. But three things and how often: you can come up with a score, or a weighted score if you want, and then go dig into, where should that data be? What data do we need? But also, then you've got at least some kind of baseline. It'll be a subjective baseline, but now you'll know what the biggest challenges are.
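The "pick three, then weight by frequency" scoring described above can be tallied in a few lines. The frequency weights here are an assumption for illustration; tune them to whatever cadence options your survey actually offers, and the barrier names are made up.

```python
# Sketch of a weighted score from "top three barriers x frequency" responses.
# Weights and sample data are hypothetical.
from collections import Counter

FREQ_WEIGHT = {"hourly": 5, "daily": 4, "weekly": 3, "monthly": 2, "quarterly": 1}

# Each respondent picks up to three barriers, each with a frequency.
responses = [
    [("flaky tests", "daily"), ("slow builds", "hourly"), ("code review wait", "weekly")],
    [("slow builds", "daily"), ("flaky tests", "daily")],
    [("provisioning", "quarterly"), ("slow builds", "hourly")],
]

scores = Counter()
for picks in responses:
    for barrier, freq in picks:
        scores[barrier] += FREQ_WEIGHT[freq]

for barrier, score in scores.most_common():
    print(barrier, score)
```

The ranked output gives you the subjective baseline Nicole mentions: the top-scoring barrier is where you go dig for objective data next.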

Lenny RachitskyI love how all this comes back to just starting by talking to people and asking them these things, which is very similar to product management and building great products: have you talked to your customers? Everyone thinks they're doing this, but most people are not doing this enough.

Nicole ForsgrenAnd I will say one thing that's challenging when you think about getting data: interviews are data, and that's important; surveys are a little more quantified because we can turn them into counts, but that's where we also want to be careful. A lot of folks go to write a survey question and they'll say something like, "Were the build and test systems slow or complicated in the last week?" You're asking four different questions there. If someone answers yes, was it the build? Was it the test? Was it slow, or was it flaky or complicated or something? So it can be really difficult to untangle what the signal is you're actually getting there, and so it is worth the time chatting with someone who's familiar with survey design, or having a conversation with Claude or Gemini or ChatGPT around, "Here are the survey questions," or, "Can you propose some?" And then make sure you take a couple of rounds with it. Is this a good survey question? What questions can I answer from the data that I get? What problems could I solve? If you can't answer a question with the data, don't collect it.

Lenny RachitskyAnd you have example surveys in your book for folks that want to just copy and paste and not have to think about this much.

Nicole ForsgrenYeah, example surveys, a lot of example questions. We even recommend what the format and flow should look like, how long it should be, how long it should not be.

Lenny RachitskyOne thing that I was reading is that you don't love happiness surveys specifically, asking engineers how happy they are, is that true? If so, why is that?

Nicole ForsgrenI don't, no. Well, I'll say I don't love a happiness survey because there are too many things that contribute to happiness. Happiness is a lot, right? So happiness is work, happiness is family, happiness is hobbies, happiness is weekends, happiness... There are so many things that contribute to happiness. Now, that doesn't mean I don't care about happiness. I think happiness surveys are just not particularly useful here. What can be helpful is satisfaction, and people are like, "That's the same thing." It's not, because you can ask, "Are you satisfied with this tool?" and then ask some follow-up questions. Now, those two are related, because the more satisfied you are with your job and your tools and the work and your team, the more it contributes to happiness. I used to joke... Remember the old commercials, like, "Happy cows make happy cheese"?

Lenny RachitskyNo.

Nicole ForsgrenIt was a California one. That was the best. Happy devs make happy code. They write better programs, they do better work, they're better team members and collaborators. But capturing and trying to directly influence happiness, that's not what we are here for. It's too challenging, it's too all-encompassing. Satisfaction can give us some signal.

Lenny RachitskyIn a totally different direction, in terms of just tools you see people using, are there any that just like, "Oh, yeah, this one's really commonly great." For people, this is just a tool people are finding a lot of success with. There's the common ones, Copilot, Cursor. I don't know. Is there anything that stands out that you want to share, just like, "Hey, you should check this tool out. People seem to love it"?

Nicole ForsgrenI think they're huge, right? Copilot, Cursor, Gemini.

Lenny RachitskyClaude Code.

Nicole ForsgrenYep, Claude Code. I love Claude Code.

Lenny RachitskyI have a whole post coming on ways to use Claude Code for non-engineering use cases.

Nicole ForsgrenCool. Nice.

Lenny RachitskyIt's so interesting. For example, you can tell Claude Code, "Find ways to clean up storage on my laptop," and it just finds a bunch of files. It's just like ChatGPT running on your computer, and it can do all kinds of crazy stuff on your computer for you, like a mini god.

Nicole ForsgrenI'm going to do that now. This is great.

Lenny RachitskyIt's so good. Yeah, that's why I'm writing this. Dan Shipper was on the podcast and he said Claude Code is the most underrated AI tool out there because people don't realize what it's capable of. It's not just for coding, and that's what I'm trying to explore more and more. Okay. Is there anything else that you think would be valuable to help people improve their developer experience, help them adapt to this new world of AI and engineering, that we haven't covered?

Nicole ForsgrenI think something that's important to think about in general is to bring a product mindset to any type of DevEx improvements that are happening, and also to the metrics that we collect and capture. By that, I mean we want to identify a problem, make sure we're solving a problem for a set of users. We want to think about creating MVPs and experiments, get fast feedback, do some rapid iteration. We want to have a strategy. We want to know who our addressable market is. We want to know what success is. We want to basically have a go-to-market function. We need to have comms. We need to get continuous feedback from our customers. We want to keep improving. And, at some point, we want to think about sunsetting something. Is it in maintenance mode? Is it sunsetting?

And I think that's important in general, but I think it's extra important now because when we have AI tools, we're using AI tools, we're embedding AI into our products, things are changing so rapidly that it can be really important to take half a beat and say, "Okay, what's the problem I'm trying to solve right here? Is this metric that we've had for the last 10 years still important or should this be sunset because it's not really important anymore? It's not driving the types of decisions and actions that I need."

Lenny RachitskyBefore we get to our exciting lightning round, I want to take us to AI Corner, which is a recurring segment on this podcast. Is there some way that you've found a use for an AI tool in your life, in your work that you think might be fun to share, that you think might be useful to other people?

Nicole ForsgrenI have been working on some home design and redecorating rooms and stuff. I'm working with a designer because I know what I like, but I don't know how to get there; I'm not good at this. But I've really been loving ChatGPT and Gemini especially to render pictures for me. So I can give it the floor plan, I can give it one shot of the room that's definitely not what it's supposed to look like, and then I can give it pictures of a couple different things, and then I can just tell it to change the walls or change the furniture layout or change something. It helps me, and it's relatively quick. It helps me visualize things... Again, I know what I like, but I don't know how to get there, so I know if I like it or not. Which is probably a very random use, but it's fun for now.

Lenny RachitskyMy wife does exactly the same thing. She's sending me constantly, "Here's what this rug will look like in our living room. Here's this water feature." It's so good and it keeps getting better. It's just like, "Wow, that's exactly our house with this new rug," and all you do is just upload these two photos and just like, "Cool. How would this look in our room?"

Nicole ForsgrenYeah, I've been impressed a couple times. Definitely the machines are listening to us. It's given me a mock-up of a room or something and then it throws in a dog bed, because I have dogs. I'm like, "I did not tell you to do that, but yeah, that's probably the color and style of dog bed that I should have in this room."

Lenny RachitskySpeaking of that, have you tried this use case, ask ChatGPT, "Generate an image of what you think my house looks like based on everything you know about me."

Nicole ForsgrenI haven't.

Lenny RachitskyBecause it has memory and it remembers everything you've talked about, and it's hilarious. You got to do it.

Nicole ForsgrenOkay, that's on my to-do list.

Lenny RachitskyThere we go. Bonus use case. Nicole, with that, we've reached our very exciting lightning round. I've got five questions for you. Are you ready?

Nicole ForsgrenAwesome. Let's go.

Lenny RachitskyWhat are two or three books that you find yourself recommending most to other people?

Nicole ForsgrenOutlive by Peter Attia is fantastic. Another one that's maybe related, because I hurt my back so it's not great: Back Mechanic by Stuart McGill is incredible. Shout out to anyone who has a hurt lower back. It's for a lay person to read through and figure out how to fix lower back problems. It's kind of a random one. I will say I love How Big Things Get Done. I can't pronounce the names; I think one of the authors is Scandinavian. It dissects really large projects through recent-ish history, and where they failed and why. And I think it's really interesting for us to think about, especially now in this AI moment where basically all of our software systems, at least, are going to be changing. So how do we think about approaching what is essentially going to be a very large project? And then, sorry, I'm going to throw in a bonus one: The Undoing Project by Michael Lewis. Matt Velloso recommended it to me, and it's so good.

Lenny RachitskyYes, I read that-

Nicole ForsgrenI audibly gasped at the last sentence.

Lenny RachitskyOh. I was like, "What?"

Nicole ForsgrenYeah, I was not expecting it.

Lenny RachitskyI read that and I do not remember that last sentence. Oh, man. Okay, cool. Next question. Do you have a favorite movie or TV show you recently watched and enjoyed?

Nicole ForsgrenI'll say I watch Love Is Blind. If I got to shut down at the end of the day, Love Is Blind is fun.

Lenny RachitskyThere's a new season out.

Nicole ForsgrenYeah, very excited... and Shrinking. Have you seen Shrinking?

Lenny RachitskyNo. I think I started The Therapist and yeah, I gave it a shot.

Nicole ForsgrenStrongly recommend it. It's cute.

Lenny RachitskySweet. Is there a product you've recently discovered that you really love? Could be an app, could be some kitchen gadgets, some clothing.

Nicole ForsgrenYeah, the Ninja Creami is-

Lenny RachitskyDid you say this last time?

Nicole ForsgrenI don't know. I may have. I don't think so.

Lenny RachitskySomebody said this and I still remember it. It's like-

Nicole ForsgrenIt's so good.

Lenny Rachitsky... you make ice cream and stuff with it, right?

Nicole ForsgrenYeah, and you can basically freeze a protein shake and then it turns it into ice cream-

Lenny RachitskyOh, man.

Nicole Forsgren... which is delicious. Another one is a Jura coffee maker. I love good coffee and I'm not great at making it, so I can just push the button and it'll give me anything I want, including lattes, cappuccinos or anything. So that's kind of fun.

Lenny RachitskySweet, okay. Do you have a favorite-

Nicole ForsgrenJust sugar and caffeine. I just need to power through the day.

Lenny RachitskyThere's the engineering productivity 101.

Nicole ForsgrenYes.

Lenny RachitskyOh, man. Okay, two more questions. Do you have a favorite life motto that you often find useful in work or life and come back to in various ways?

Nicole ForsgrenYeah, I think one that's come up a couple times, and it's not a verbatim thing, it's more the vibe: hindsight is 20/20, but judging past decisions by hindsight is also really dumb. I think if we made the best decision we could at the time with the information we had available, then it is what it is. If you made a bad decision because you knew better, you had the information, that's not great. I don't think we give ourselves or other people enough grace, because we always end up finding out more information later.

Lenny RachitskyHear, hear. Final question. I was going to ask you something else, but as we were preparing for this, you shared that you have a new role at Google. Maybe just talk about that: what you're up to there, why you joined Google, anything folks should know.

Nicole ForsgrenSure. I am senior director of developer intelligence and core developer. It's super exciting and super fun because of all of these things we've been talking about. It's focused on Google and all their properties and their underlying infrastructure: how can we improve developer experience, developer productivity, velocity, all of these things we've been talking about. And, because I'm kind of the numbers person, how do we want to think about measuring it, how does measurement change, how do feedback loops change, how can we improve the experience throughout, and then drive that change through an organization in ways that are meaningful and impactful and faster than they've been before.

Lenny RachitskyNice job, Google, getting Nicole. What a win. I need to get some more Google stock ASAP. Okay, two follow-up questions. Where can folks find you online and find your book online if they want to dig deeper? And how can listeners be useful to you?

Nicole ForsgrenOnline, you can find the book at developerexperiencebook.com, I'm at nicolefv.com, and on LinkedIn occasionally. Sometimes it's a mess; I try to wade through all of the noise, but I get there. To be useful, sign up for the book and the workbooks. The workbooks are free. I'd love to get any kind of feedback on what works and what doesn't. I always love hearing those kinds of stories.

Lenny RachitskyNicole, thank you so much for being here.

Nicole ForsgrenThanks for having me, Lenny.

Lenny RachitskyMy pleasure. Thanks, again. Bye, everyone.

Thank you so much for listening. If you found this valuable, you can subscribe to the show on Apple Podcasts, Spotify, or your favorite podcast app. Also, please consider giving us a rating or leaving a review as that really helps other listeners find the podcast. You can find all past episodes or learn more about the show at lennyspodcast.com. See you in the next episode.

English Original transcript

Lenny RachitskyA lot of companies are trying to measure productivity for their teams.

Nicole ForsgrenMost productivity metrics are a lie. If the goal is more lines of code, I can prompt something to write the longest piece of code ever. It's just too easy to game that system.

Lenny RachitskyHow do I know if my eng team is moving fast enough, if they can move faster, if they're just not performing as well as they can?

Nicole ForsgrenMost teams can move faster. But faster for what? We can ship trash faster every single day. We need strategy and really smart decisions to know what to ship.

Lenny RachitskyOne of the biggest issues we're going to probably have with AI is learning how much to trust code that it generates.

Nicole ForsgrenWe can't just put in a command, get something back, and accept it. We really need to evaluate it. Are we seeing hallucinations? What's the reliability? Does it meet the style that we would typically write?

Lenny RachitskySo much of the time is now going to be spent reviewing code versus writing code.

Nicole ForsgrenThere's some real opportunity there to not just rethink workflows, but rethink how we structure our days and how we structure our work. Now, we can also make a 45-minute work block useful because getting into the flow is actually kind of handed off, at least, in part to the machine or the machine can help us get back into the flow by, reminding us of context and generating diagrams of the system.

Lenny RachitskyWhat's just one thing that you think an eng team, a product team can do this week, next week to get more done?

Nicole ForsgrenHonestly, I think the best thing you can do-

Lenny RachitskyToday, my guest is Nicole Forsgren. With so much talk about how AI is increasing developer productivity, more and more people are asking, "How do we measure this productivity gain? And are these AI tools actually helping us or hurting how our developers work?" Nicole has been at the forefront of this space longer than anyone. She created the most used frameworks for measuring developer experience called DORA and SPACE. She wrote the most important book in the space called Accelerate and is about to publish her newest book called Frictionless, which gives you a guide to helping your team move faster and do more in this emerging AI world. Her core thesis is that AI indeed accelerates coding. But developers aren't speeding up as much as you think because they still have to deal with broken builds and unreliable tools and processes, and a bunch of new bottlenecks that are emerging.

See what over 200,000 entrepreneurs love about Mercury. Visit mercury.com to apply online in 10 minutes. Mercury is a FinTech, not a bank. Banking services are provided through Mercury's FDIC-insured partner banks. For more details, check out the show notes. Here's a puzzle for you. What do OpenAI, Cursor, Perplexity, Vercel, FLAN, and hundreds of other winning companies have in common? The answer is they're all powered by today's sponsor, WorkOS. If you're building software for enterprises, you've probably felt the pain of integrating single sign-on, SCIM, RBAC, audit logs, and other features required by big customers. WorkOS turns those deal blockers into drop-in APIs with a modern developer platform built specifically for B2B SaaS.
Whether you're a seed-stage startup trying to land your first enterprise customer or a unicorn expanding globally, WorkOS is the fastest path to becoming enterprise-ready and unlocking growth. They're essentially Stripe for enterprise features. Visit workos.com to get started or just hit up their Slack support where they have real engineers in there, who answer your questions super fast. WorkOS allows you to build like the best with delightful APIs, comprehensive docs, and a smooth developer experience. Go to workos.com to make your app enterprise-ready today.
Nicole, thank you so much for being here and welcome to the podcast.

Nicole ForsgrenThank you. It's so good to be here.

Lenny RachitskyIt's so good to have you back. I was just watching our first episode, which we did two and a half years ago. I was watching it, and I was both shocked and not shocked that we barely talked about AI. The episode was called How to Measure and Improve Developer Productivity, and we got to AI barely like an hour in and we're just like, "Hmm, I wonder what's going to happen with AI and productivity." Does that just blow your mind?

Nicole ForsgrenYeah. Because it was just hitting the scene, it was the topic of so much conversation, and at the same time, so many things don't change. So many things are still important, so many things are the same. Yeah. It's also a little wild that it's been two and a half years. Where does time go? Time is a social construct.

Lenny RachitskyYeah. Most of our conversation was just questions like, "Well, how might this impact people? How will we change the way we build product?" It was barely a thing back then. Now, it's the only thing that I imagine people want to talk about when they talk about engineering productivity. That's where we're going to be spending a lot of our time focusing on today. The reason I'm excited about this conversation, it feels like there's been so much money poured into AI tools increasing productivity. The fastest growing companies in the world are these engineering AI tools. And now, more and more people are just asking this question of just, "What gains are we getting out of this? How much is this actually helping us be more productive? How do we become more productive?"

You've been at the center of this world for longer than anyone. You've invented so many of the frameworks that people rely on now. So I'm really excited to have you back to talk about this stuff. I want to start with just this term DevEx, it's something that comes up a lot in this whole space, and we're going to hear this term a bunch in this conversation. Can you just explain what is DevEx, this term DevEx?

Nicole ForsgrenDevEx is developer experience. And when we think about developer experience, we're really talking about what it's like to build software, day to day, for a developer. So the friction that they face, the workflows that they have to go through, any support that they have. It's important because when DevEx is poor, everything else just isn't going to help. The best processes, the best tools, the best... whatever magic you have, if the DevEx is bad, everything kind of takes-

Lenny RachitskyWithin DevEx is productivity, and I think the key insight that you and other folks in the space had is that it's not just productivity; there's also engineering happiness. We're going to get into a lot of these parts, but maybe speak to... there's productivity, and there are broader components to engineers being successful at a company.

Nicole ForsgrenYeah. I love that point because productivity, first of all, is hard to define anyway. But if you're just looking at output, you can get there in a lot of different ways. And if you're getting there in ways that are high toil or high friction, then at some point, a developer is going to burn out. Or if it's super high cognitive load, if it's hard to even think about what you're doing because you're concentrating on the mechanics of... the plumbing of something, then you don't have the brain space left to come up with really innovative solutions and questions. So I love that it's kind of this self-reinforcing loop: you do more work, you do better work. And it's better for people, it's better for the systems, it's better for our customers.

Lenny RachitskyI was going to get to this later, but I want to actually get to it right now: this idea of flow state for engineers. I was an engineer, actually, early in my career. I went to school for computer science. I was an engineer for 10 years. The best part of the job for me was just this flow state you enter when you're coding and building, when things feel so fun. It feels like AI is making that harder in a lot of ways, because there are all these agents you're working with now, and all this code that's kind of being written for you. Talk about the importance of flow state to a developer's happiness and productivity. How have you seen AI impacting that?

Nicole ForsgrenWell, there are lots of different ways to talk about DevEx. One way to talk about it is kind of three key things that have components that are important of themselves, and they also kind of reinforce each other. Flow state is one of them, cognitive load is another, and then feedback loops are another. I think when you touch on this... Your question about flow state is a really good one, and I'll admit we're just a few years into this. We're still figuring out what the best flow state and cognitive requirements are for people in this because, to your point, sometimes we're getting interrupted all the time. You don't just get in the flow and lock down, and write a whole bunch of code and do the typing of a whole bunch of code as much anymore. Instead, you're kind of creating a prompt, getting some code back and reviewing the code, trying to integrate what's happening in the system, and that can really interrupt.

At the same time though, it can contribute to flow if... I've seen some senior engineers pull together some tool chains that are really incredible, where they figured out how to keep the flow going. The fast feedback loops really, really work well for them. They can kind of assign out different pieces to agents. It helps them keep in the flow in terms of... Instead of details and line-by-line writing, they're in the flow in terms of, "What's my goal? What are the pieces that I need to get there? How quickly can I get there? So then, I can step back and kind of evaluate everything, and then dive back in and fix some pieces."

Lenny RachitskyIs there anything more you could say about this engineer that figured out this really cool workflow, about just what that looks like?

Nicole ForsgrenI've spoken with a handful of them, and I've kind of watched them work. I haven't built it myself yet. It's on my list. They've been able to set up this really incredible workspace and workflow where... Right now, a lot of us play around with tools and... We'll put in a prompt and we'll get a few lines back, or maybe we'll put in a prompt and we'll get whole programs back. Well, what they can do is... Many times I'll see them say, to help prime it, "This is what I want to build. It needs to have these basic architectural components. It needs to have this kind of a stack. It needs to follow this general workflow. Help me think that through," and it'll kind of design it for them. And then for each piece, it'll assign an agent to go work on it in parallel, and they'll say upfront, "These need to be able to work together, make sure it's architected correctly. Make sure we use appropriate APIs and conventions."

Then at the end, they can let it run for a few minutes. They can think through something else that's interesting or they anticipate is going to be hairy, and they come back to something that's probably a little better than vibe coded. Because they were so systematic about it upfront, they're much closer to something that looks like production code.
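The prime-then-parallelize workflow described above can be sketched roughly as follows. This is a sketch of the orchestration pattern only, not any specific product's API: `run_agent` is a hypothetical stand-in for a coding agent (a real one would call an LLM tool), and all names here are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for a coding agent; a real one would call an LLM.
def run_agent(task: str, brief: str) -> str:
    return f"[code for {task!r}, following: {brief}]"

def build_feature(goal: str, pieces: list[str], conventions: str) -> dict[str, str]:
    """Prime with the overall goal and conventions, fan the pieces out to
    agents in parallel, then hand the results back for human review."""
    # Upfront priming so the pieces are architected to work together.
    brief = f"part of {goal!r}; must interoperate; follow {conventions}"
    with ThreadPoolExecutor() as pool:
        futures = {piece: pool.submit(run_agent, piece, brief) for piece in pieces}
        # The human integrates and evaluates everything after the fan-out.
        return {piece: f.result() for piece, f in futures.items()}

parts = build_feature(
    goal="user settings page",
    pieces=["API handler", "data model", "UI form"],
    conventions="REST + house style",
)
```

The point of the pattern is the upfront priming: because the architecture and conventions are stated before the fan-out, what comes back is closer to production code than a pure vibe-coded result.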

Lenny RachitskySo what I'm hearing is: spend a little time upfront planning what all these AI engineers are doing, versus just powering through and figuring it out as you go.

Nicole ForsgrenYeah.

Lenny RachitskyOkay, cool. Let me get to a core question that I think is on a lot of people's minds. A lot of companies are trying to measure productivity for their teams: "Is this improving our productivity? Is this hurting our productivity?" So let me just start with this question: how are people doing this wrong currently when they try to measure their productivity gains with AI?

Nicole ForsgrenI'll say most productivity metrics are a lie. It's really tricky because, historically... Now, look, lines of code has always been a bad metric, but many folks still use lines of code, yeah, as some proxy for output or productivity or complexity or something. Well, now, for many of the systems that use lines of code, the ones they would sometimes whisper about and not super talk about, it's just blown out of the water because, "What do you mean by lines of code?" If the goal is more lines of code, I can prompt something to write the longest piece of code ever and add tons of comments. We know that agents and LLMs tend to be very verbose by definition, and so it's just too easy to game that system and then introduce complexity and technical debt into all of the work that you're doing. I will say there are some things that we can kind of watch and pay attention to because... So lines of code as a productivity metric isn't great, it's pretty bad. But now, it's kind of more relevant if we can tease out which code came from people and which code came from AI, because now we can answer downstream questions.

"What is the code survivability rate? What is the quality of our code? Is our code being fed back into training systems? And for that code that's retraining systems later, especially if we're doing fine-tuning and local tuning, how much of that is machine generated? What types of loops is that creating, and what types of patterns or biases might it be inadvertently introducing?" On the one hand, it's not good as a productivity metric, but it can be useful. I'll even say the same for DORA and its metrics, the speed metrics and the stability metrics. If that's all you're looking at, it's not going to be sufficient anymore because AI has now changed the way we think about feedback loops. They need to be much faster. Now, what DORA's meant for, kind of assessing the pipeline overall in terms of speed and stability, still works. But we can't just blindly apply the existing metrics we've used before, because we'll miss super important phenomena and changes in the way people work.
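Teasing apart human- versus AI-authored code and tracking its survivability might look something like this minimal sketch. The commit fields (`ai_generated`, `surviving_lines`) are hypothetical annotations your tooling would have to supply; no standard schema is assumed.

```python
def attribution_report(commits):
    """Summarize lines written and code survivability, split by whether
    the code was AI-generated. Each commit is a dict with hypothetical
    fields: 'lines', 'ai_generated' (bool), and 'surviving_lines'
    (lines still present at a later audit of the codebase)."""
    report = {}
    for is_ai in (True, False):
        group = [c for c in commits if c["ai_generated"] == is_ai]
        lines = sum(c["lines"] for c in group)
        surviving = sum(c["surviving_lines"] for c in group)
        report["ai" if is_ai else "human"] = {
            "lines": lines,
            # Fraction of authored lines that survived later rewrites.
            "survival_rate": surviving / lines if lines else 0.0,
        }
    return report

report = attribution_report([
    {"lines": 100, "ai_generated": True, "surviving_lines": 60},
    {"lines": 50, "ai_generated": False, "surviving_lines": 45},
])
```

A report like this answers the downstream questions in the passage above (survivability, how much machine-generated code is flowing back into retraining) without treating raw line counts as a productivity score.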

Lenny RachitskyInteresting. You invented DORA, which was the main framework people used for a long time to measure productivity. And then there's SPACE, there's Core 4, there's probably others. So what I'm hearing here is all these are kind of out of date now that AI is contributing large portions of code.

Nicole ForsgrenI will say if it is a prescriptive metric, it needs to be used only in the way it was prescribed. So with DORA, there are four key metrics. There's two speed metrics, deployment frequency and lead time, so code commit to code deploy. There's two stability metrics, MTTR and change fail rate. If those are used to assess the speed of the pipeline and the general performance of the pipeline, that's great. If you're trying to use those to understand... Because implied in that is feedback loops, right, because you used to kind of get feedback from customers. But we can't just use that blindly now when we're using AI, as an example, because we have feedback loops much earlier, and not even just at the local build and test phase. We have feedback loops throughout, even sometimes in the middle of the pipeline, that we really want to leverage in ways that weren't as useful before. I won't say they weren't possible, but we just didn't really focus there.
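As a concrete illustration, the four DORA metrics named here can be computed from a simple deployment log. This is a minimal sketch: the record fields (`committed_at`, `deployed_at`, `failed`, `restored_at`) are assumptions for illustration, not a standard schema.

```python
from datetime import datetime, timedelta

def dora_metrics(deploys, period_days):
    """Compute the four DORA metrics from a list of deploy records.
    Each record: 'committed_at', 'deployed_at', 'failed' (bool), and
    'restored_at' for failed changes (field names are illustrative)."""
    freq = len(deploys) / period_days  # deployment frequency
    lead = sum((d["deployed_at"] - d["committed_at"] for d in deploys),
               timedelta()) / len(deploys)  # lead time: commit -> deploy
    failures = [d for d in deploys if d["failed"]]
    cfr = len(failures) / len(deploys)  # change fail rate
    mttr = (sum((d["restored_at"] - d["deployed_at"] for d in failures),
                timedelta()) / len(failures)) if failures else timedelta()
    return {"deploys_per_day": freq, "lead_time": lead,
            "change_fail_rate": cfr, "mttr": mttr}

metrics = dora_metrics(
    [{"committed_at": datetime(2025, 1, 1, 9),
      "deployed_at": datetime(2025, 1, 1, 12), "failed": False},
     {"committed_at": datetime(2025, 1, 2, 9),
      "deployed_at": datetime(2025, 1, 2, 10), "failed": True,
      "restored_at": datetime(2025, 1, 2, 11)}],
    period_days=7,
)
```

Per the point above, numbers like these assess the speed and stability of the pipeline overall; they say nothing by themselves about the earlier, faster feedback loops AI introduces.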

So those are prescriptive metrics. When we think about SPACE, SPACE is a framework. It doesn't tell you what metric to use. So I'll say, sometimes people get real frustrated because I didn't tell them what to measure. But now, I think that's the power of it. We're actually seeing that SPACE applies fairly well in these new emerging contexts like AI. SPACE is an acronym. We still want to look at satisfaction. We still want to look at performance, what's the outcome. We still want to look at activity; yes, in some ways, lines of code and number of PRs can be useful for something, or number of alerts, activities or counts. C is communication and collaboration. This is also super important and useful because it's how our systems communicate with each other, and also how our people do. "What proportion of work is being offloaded to a chatbot versus talking to a senior engineer on the team?" More isn't always better and less isn't always better; it depends.
And then E is efficiency and flow: "Can people get in the flow? How much time does it take to do things? What is the flow like through our system?" Here, I would probably add a couple of dimensions. Chatting with some of the early authors, I'd say trust. Not to say trust wasn't important before, but now it's very, very front of mind. Right? Before, you'd build your code, and if the compile comes back, you're fine. That's the way it is. LLMs are non-deterministic. Right now, we can't just put in a command, get something back, and accept it. We really need to evaluate it. So, "Are we seeing hallucinations? What's the reliability? Does it meet the style that we would typically write? And if it doesn't, is that fine?" So it depends. These are prescriptive metrics; you've got to make sure you're using them fit for purpose. Right?
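The evaluate-don't-accept posture described here can be sketched as a tiny acceptance harness for generated code. The checks below are illustrative placeholders (not a real eval framework); in practice they'd be things like compile checks, linters, and hallucinated-import detectors.

```python
def evaluate_generation(code: str, checks) -> dict:
    """Run a generated snippet through named acceptance checks instead
    of accepting it blindly. 'checks' maps a name to a callable that
    takes the code and returns True/False; all checks here are
    illustrative stand-ins, not any specific tool's API."""
    results = {name: check(code) for name, check in checks.items()}
    results["accepted"] = all(results.values())
    return results

# Hypothetical checks: does it exist, and does it match house style?
checks = {
    "non_empty": lambda code: bool(code.strip()),
    "house_style": lambda code: "\t" not in code,  # e.g. spaces, not tabs
}

good = evaluate_generation("def f():\n    return 1\n", checks)
bad = evaluate_generation("def f():\n\treturn 1\n", checks)
```

The design choice is that acceptance is a recorded, multi-dimensional verdict rather than a gut call, which also gives you a trust signal you can track over time, per the added SPACE dimension.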

Lenny RachitskyWe're going to get to your current thinking on the best way to do this stuff. You have a book coming out that explains how to do this well, so we're going to get to that. One thing I wanted to highlight in our last chat that we had, you highlighted that one of the biggest issues we're going to probably have with AI is trust, understanding and learning how much to trust the code that it generates, and also how much... you said this, two and a half years ago, that so much of the time is now going to be spent reviewing code versus writing code. That's exactly what I'm hearing.

Nicole ForsgrenI think it'll be interesting to see how that impacts the way we structure work moving forward. We were talking about flow state and cognitive load. Now that our attention has to focus on things at certain times and it's broken up from how we used to do it, I think there's some real opportunity there to, not just rethink workflows, but rethink how we structure our days and how we structure our work.

Lenny RachitskyCan you say more about that? Just what is that? What are you thinking will be happening? Where do you think things go? What are you seeing working?

Nicole ForsgrenThis is purely speculative. But for example, Gloria Mark has done some really good work on attention and deep work, and humans can get about four hours of good deep work a day. That's about it.

Lenny RachitskyYeah. I feel that.

Nicole ForsgrenThat's kind of the upper limit-ish for the most part, and I'm sure people are going to be like, "Well, I am superhuman and I can do-"

Lenny RachitskyWhat if you take 20 grams of creatine?

Nicole ForsgrenRight. What if we microdose?

Lenny RachitskyYeah, exactly.

Nicole ForsgrenYeah. So in the context of knowing we have about four hours of good deep work... I'm sure many of us have probably hit this, right? We have good periods. Maybe it's morning, maybe it's afternoon for folks. And then you hit a time where you're like, "I'm going to clean up my inbox because that is all I can do right now. I can be functional, but I'm not going to come up with my best innovative, problem solving, authoring, code writing work." A lot of times, the way to do that and to get into it is to have these long chunks to get into flow and to get that deep work. Usually, I'm two hours-ish. An hour can be tricky because it could take time to get into that state. Okay. Well, when we think about what it used to be like, back in the old days, three years ago, three and a half years ago, we could block off four hours of time and we could probably get two or three hours of really good work done. Because we were just focused, right? There were no interruptions, minimal interruptions.

Now, the nature of writing code and systems itself is interrupt driven, or full of interruptions at least, because you start something and then it interjects. So how do we think about that? Does that mean that a four-hour work block is still useful? Probably. But does that mean that now we can also make a 45-minute work block useful? Because getting into the flow is actually kind of handed off, at least in part, to the machine, or the machine can help us get back into the flow by reminding us of context and generating diagrams of the system and all the things. So I think that's a really, really interesting area that's just ripe for questions and opportunity. And please, folks, do this research and come back to me because... It might not make my list, but it's such a great question.

Lenny RachitskyThat is so interesting. Essentially, every engineer is turning into an EM, engineering manager, coordinating all of these junior AI engineers. So your point is even if you only have a 30-minute block, you can't get deep into code, but you can unblock all these AI engineers that are running off doing tasks. Plus, your point is they remind you of just like, "Here's where you left off. Okay. You can just jump into this code, maybe make some tweaks."

Nicole ForsgrenYeah.

Lenny RachitskySo interesting. Let me zoom out a little bit and... Before we get into your framework for how to approach developer experience, the latest thinking you've got, beyond just obviously engineers doing more is great, what's your best pitch for why companies should really, really focus on developer experience?

Nicole ForsgrenI hate to say return on investment, but the business value is... the opportunity here is huge. In general, we write software for fun and for hobbies, but we also have software because it meets a business need. It helps us with market share, it helps us attract and retain customers, it helps us do all of these things. And I think DevEx is important because it enables all of that software creation, it enables all of that problem solving. It enables the super rapid experimentation with customers that... Before, you'd need a while for a prototype and maybe a little bit longer to actually flight it through an A/B test on a production system. You can do it in hours, right now.

Lenny RachitskyMaybe the opposite end of the spectrum, getting very tactical, before we get into the larger framework, what's just one thing that you think an eng team, a product team can do this week, next week to help their developer experience maybe get more done?

Nicole ForsgrenHonestly, I think the best thing you can do is go talk to people and listen. I love that the audience of this podcast is primarily PMs because they tend to be really good at this. And I would say start with listening and not with tools and automation. So many times companies are like, "Well, I'm just going to build this tool," or, "I'm going to build this thing." Often you build a thing that you yourself have had a challenge with or that is easy to do, easy to automate. And if you just go talk to people and ask the developers like, "Think of yesterday, what did you do yesterday? Walk me through it. What were the points that were just delightful? What were the points that were really difficult? Where did you get frustrated? Where did you get slowed down? Where was there friction?" If you go talk to a handful of people, a lot of times, you can surface a handful of things that are relatively low lift and still have impact or you can identify a process that's unnecessarily complex and slow.

Lenny RachitskySo what I hear in the listening part is: you want to help your teams move faster and be happier eng teams, and your advice is, "Before you do anything, just go ask them what is bothering you."

Nicole ForsgrenGo ask them, yeah. And trust me, most developers are going to be more than happy to tell you what's broken and what's bad. I'll say, there was one company that I had worked with. I remember they had a process that was really difficult, and it was on an old mainframe system, and they were going to have to replatform the whole thing, so they never went to work on it or talked about it. Everyone hated it because it was this huge delay. I mean, all they had to do was change a process. Sometimes all you have to do is change a process. And they changed it. Instead of, I think, someone having to print it out and walk it down three or four flights to get approval, and then someone else walking it back up, they just sent an email. They didn't replatform anything. They didn't redesign anything major.

Lenny RachitskyLet me push on that and... I'm curious just what are the most common things people do. If you're just starting on, "Okay, we need to focus on engineering experience," what do you find are the most... two or three most common improvements companies need to make?

Nicole ForsgrenI'll say, I'll kind of echo that process point: there's almost always a process that can be improved, and improved without a lot of engineering lift or a lot of engineering headcount. Most large companies, in particular, have something that takes several, several steps. It's the way it is because it's the way it is, but it doesn't have to stay that way. And even small companies are sometimes just a little too YOLO; there's no process, and you're kind of chasing everyone around. So if you can create a very lightweight process, that can also be helpful. That can be one of the best places to start, especially if you have limited exposure to the whole rest of the org. Sometimes just a team process can help.

I will say from a business leader's standpoint, a lot of what you can do is provide structure and support for this organizational change. Communicate what you're doing, communicate what the priorities are, communicate why this is important, and celebrate wins. Because if folks try to do this as just a one-off, fully-isolated side project, it's really challenging to get good momentum, to get people to care, and to get them to stay involved. It feels like just another internal project that isn't going to matter or get celebrated, but it has huge potential upside returns for the business.

Lenny RachitskyIt's interesting, what I'm hearing here is nothing about tools or technologies. It's not like move to this cloud, it's not like install this new deployment system, it's processes and people and org and morale.

Nicole ForsgrenYeah. Now, there will be technical pieces that are very important, especially now with AI, where we're rethinking how build and test systems work. We're rethinking feedback to users so that it's very, very customized in terms of what is shared and when it is shared. There are a lot of technical pieces that are involved, but that's not the only thing. It's necessary but not sufficient, and that doesn't have to be the place that you start.

Lenny RachitskyI have a hard question I want to ask you that I thought of as you were talking. I feel like this is the question that most founders and heads think about. And the question is just like, how do I know if my eng team is moving fast enough, if they can move faster, if they're just not performing as well as they can? What are just maybe smells, signs that tell you, "Yeah, my team should be moving faster," versus, "This is just the way it works. This is as fast as they can move"?

Nicole ForsgrenMost teams can move faster, right? Also, given what we know about cognitive load, not all speed gains are necessarily good. Or the upside is going to be limited once you hit a certain point, and most people are not even near that point. I don't know a single team that is, frankly. But how do you know? You know if you're always hearing about builds breaking, flaky tests, overly long processes when you have to request a new system or provision a new environment, or if it's really, really hard to switch tasks or switch projects. So if someone has an opportunity to go work in another part of the org and they don't for reasons that are unclear and not political, or if everyone has something to say about the system, that's usually a pretty good smell that there's friction somewhere.

Because once you finally figure out your system and you're able to get work done, the switching costs can often be really, really high to go anywhere else. So sometimes people will do that. But I've worked with companies where switching orgs within the company, you had to basically pay the same tax as a new hire because the systems were so different and they were so full of friction, and it was so difficult to do so many things.

Lenny RachitskyI love the first part of your answer especially, which is that you can always move faster. I think every founder is going to love hearing that. To your point though, there are diminishing returns over time?

Nicole ForsgrenYeah. And you don't know about the quality, right? So I think that's the other side: you can always move faster, but faster for what? Are we making the right business decisions? And I think that's especially where PMs come in. We can ship trash faster every single day. We need strategy and really smart decisions to know what to ship, what to experiment with, what features we want to do in what order, and what the rollout looks like. The strategy is the core piece, and then think about speeding that up. If we don't have the other pieces in place, I mean, garbage in, garbage out.

Lenny RachitskyI want to follow that thread, but before I do, just to mirror back what you shared. So the signs that there's a lot of low-hanging fruit to improve the productivity of your team: builds are always breaking, there are flaky tests that are constantly incorrect with false positives, it's hard to context switch between different projects, and you just hear people talking about how the system is really hard to work with. Is that roughly right?

Nicole ForsgrenYeah.

Lenny RachitskyCool, okay. So going back to the point you just made, there's a sense that AI is making teams so much faster because it's writing all this code for them. You're going to have all these asynchronous agents, engineers working for you. It feels like a core part of your message is that that's just one part of engineering work, and there's so much more, including figuring out what to build... and alignment internally. Maybe just speak to that. There is a lot of opportunity to improve engineering performance and productivity, but there are so many other elements that are not improved through AI?

Nicole ForsgrenYes. Or could be in the future, right?

Lenny RachitskyMm-hmm.

Nicole ForsgrenI think there are a lot of ways we can pull in AI tools to help us refine our strategy, refine our message, think about the experimentation methods or targets of experimentation, or think about our total addressable market, but we need to have that strategy and plan fairly well aligned, or at least have two or three alternatives that we want to test. Because now the engineering, or at least the prototyping especially, can go much, much faster. We can throw out prototypes. We can run tests and experiments that are customer facing, assuming we have the infrastructure in place, which allows us to learn and progress much faster than before. In some places, it used to take months to get something through production to do A/B testing and get feedback. We can do this in a day or two, definitely under a week. But we want to make sure we're building and testing the right things: "Are we partnering with the right... Do we have the data that we need?"

And I will say AI can actually be a pretty good partner there if you have a good conversation with it, and then also check with your experts: "What type of data should I be looking at? What type of instrumentation do I need? What type of analysis can I do?" Because then you can also go to your data science team and say, "I'm planning on doing this. I'd like to..." Let's not just YOLO A/B tests, because it's a shame to do a large test and end up disrupting users or customers, or breaking privacy or security protocols, and also end up with data that's unusable because you just can't get the signal you're looking for. But now I'm also seeing people accelerate that into a few days versus a few weeks, so they can start those key stakeholder discussions from a much more informed, filled-out place.
Like I mentioned earlier, I use Coda every single day. And more than 50,000 teams trust Coda to keep them more aligned and focused. If you're a startup team looking to increase alignment and agility, Coda can help you move from planning to execution in record time. To try it for yourself, go to coda.io/lenny today and get six months free of the team plan for startups. That's C-O-D-A-dot-I-O-slash-Lenny to get started for free and get six months of the team plan, coda.io/lenny.
Lenny RachitskyI love that you work with a bunch of different companies and a bunch of different types of businesses. I think very few people get to see inside a lot of different places. What kind of gains are you seeing in terms of increased productivity with AI? How big of a gain have you seen?

Nicole ForsgrenI'd say it's real, and I would also say we don't have great measures for it yet. We're still trying to figure out what to measure and what that looks like. One of the best is going to be velocity, all the way through the system: how quickly can you get a feature or a product through the system so that you can then experiment and test, either from idea to the very end, or even a single feature or piece through the system so we can test. That's really good. Now, that's also hard to tie back directly to a particular AI tool in the hands of a particular developer. But there are some other things we can look at, and what I've seen is, again, this kind of rapid prototyping.

I hate lines of code, but I'm going to use lines of code. I know I worked with some folks who had a whole set of companies they were looking at, and they found that AI was generating significantly more code for the people who were using it regularly. For folks who were regular users of AI coding environments, AI IDEs, the tool gave them more code, and then for the engineers themselves, the increase was double what the coding agent had given them. So one thing, probably a secondary or knock-on effect, or just a smell, is that it can unblock you. It can speed up the work that you would already do. I know sometimes when I work, the first few minutes, it's hard for me to start. But once I get started, I'm there. So these tools are really good at unblocking and unlocking that.

Lenny RachitskySomething I've seen people on Twitter sharing is how good OpenAI Codex, especially, is at finding really gnarly bugs. And I think it was Karpathy who shared it. He was so stuck on a bug, and no AI tool could figure it out. And then the latest version of Codex spent an hour or something looking into it and found it for him.

Nicole ForsgrenYeah. I'm hearing incredible things like that, right? And also writing and spinning up unit tests, and creating and cleaning up documentation. Because I know now people are like, "Oh, well, we have agents. I don't need to read the docs because the code is there." It turns out agents rely on good data, because it's all about how they've been trained or how they've been grounded. Better data gives you better outcomes, and some of that data includes documentation and comments. The better documentation and comments you have, the better performance you're going to get out of your AI tools.

Lenny RachitskyAnd AI can help you write that documentation. I've been working with Devin a little bit, and it's really good at that stuff.

Nicole ForsgrenYeah.

Lenny RachitskyOkay. Let's talk about this framework, this book. So you're publishing a book called Frictionless, which sounds like a dream, "How do you create a dev team that's frictionless?" It's called Frictionless: 7 Steps to Remove Barriers, Unlock Value, and Outpace Your Competition in the Age of AI. There's a seven-step process to this. Walk us through this and maybe give us just context on this book, who it's meant for, what problem it solves, and then the seven steps.

Nicole ForsgrenI will say, I also wrote this with Abi Noda who has just... of DX. He has incredible experience in the space. He's worked with hundreds of companies and so it was kind of nice bouncing ideas off of him. Also, thanks to all of the engineering leads and DevEx leads, and CTOs, and engineers that we talked to to make sure that our smells were right. So who is this book for-

Lenny RachitskyLet me take a tangent on Abi and DX, since you mentioned him. This is super interesting, and I think it connects so directly with this conversation. Abi started this company called DX, which is such a great name for a company around developer experience. They just sold the company to Atlassian for a billion dollars. It's a very high multiple on their ARR. To me, it shows exactly why this conversation is so valuable, just how much value companies are putting on improving developer experience. Atlassian would spend a billion dollars on this, an early-stage-ish startup. It was doing really well and people loved it, but it was early-stage-ish, a billion dollars. And the idea is they have all these companies using Jira and all their products, all trying to figure out how to measure productivity. It's worth a lot of money to them. And I know you were an early advisor to them too, so-

Nicole ForsgrenYeah.

Lenny Rachitsky... it just shows us how important this is.

Nicole ForsgrenYeah. Well, I think it also shows us how much value you can get out of this. There's so much low-hanging fruit, there's so much unlocked potential, and it's hard to know where to start a lot of times. I've been at large companies that have a lot of expertise and a lot of really, really smart people. But if you haven't been in this space and thinking about it this way, it's hard to know where to start, or it's easy to make simple mistakes up front that mean you kind of need to start over later. So I guess it also brings us back to, "Who is this book for?" It's for anyone who cares about DevEx: definitely technology leaders, anyone who's trying to kick off a DevEx program or is working on a DevEx improvement program. I think it's particularly relevant for PMs, because if you're PMing something that involves building software, improving DevEx will only help your team. And also, you have key skills, insights, and instincts that are so important to DevEx that, I will say, I've seen engineering teams just miss many times.

Lenny RachitskyOkay. What's the framework? What are the steps? Where do people start?

Nicole ForsgrenThe book goes through a seven-step process and also provides some key principles at the end. Step one is to start the journey. So assuming you're kicking off, you can start the journey, and this involves what we have already talked about: go talk to people, have a listening tour, synthesize what you learn, visualize the workflow and tools, get a handle on what the current state is. Step two is to get a quick win. So start small, get a quick win, pick the right projects, share out what you've done. Step three is using data to optimize the work. So establish your data foundation, find the data that's there, start collecting new data, and use some surveys for really fast insights; we include example surveys. Step four is to decide strategy and priority. Once you have some data, you need to know, of all the things that are potentially broken, and if you've already gotten your quick win, of all the things that are left, "What should I do next?" So we walk through some evaluation frameworks there.

Step five is to sell your strategy. Once you've decided, now you have to kind of convince everyone else. So now you want to get feedback, you want to share why this is the right strategy right now. Step six is to drive change at your scale. So here, we address folks that have local scope of control. If you're starting on just a dev team, you want to do it yourself, kind of grassroots effort or global scope of control. If you're the VP of developer experience or something, there are some things that you can leverage for a top down, and then how do you drive change when you're kind of somewhere in the middle, because you can leverage both types of strategies. And then step seven is to evaluate your progress and show value, and then kind of loop back around.
I will say that we wrote this so that you could jump into any step wherever you are right now. If you're kicking off a team or an initiative, you'll probably want to start at step one. You should definitely start at step one. If you're joining an existing initiative, you could jump into picking the priority or implementing the changes. So those are the seven steps. Beyond the seven steps, there are a few practices that we also recommend: thinking about resourcing it, change management, making the technology sustainable, and also bringing a PM lens to this: "How can we think about developer experience as a product, and how do we think about the metrics that we have as a product?"

Lenny RachitskyAwesome, okay. I have questions. Point people to the book real quick. What's the URL? How do they get it? When does it come out?

Nicole ForsgrenYeah, developerexperiencebook.com. Right now, you can sign up for the mailing list. We'll let you know when it's out on pre-order, and we'll also be sharing pieces of the workbook. We've got an almost hundred-page workbook that goes along with the book, and it should be out by end of year.

Lenny RachitskyOkay. So one piece of this is just this term developer experience feels very intentional in that it's not developer productivity, developer work. It's how do we make developer experiences better at our company, which includes they get more done, but also they're happier and things like that. So I think that's an important element of this, right?

Nicole ForsgrenYeah, absolutely.

Lenny RachitskyOkay.

Nicole ForsgrenBecause, again, it's not just about productivity. We talked about this from the frame and the lens of, "We need to be building the right thing." And you want to be productive, but you also want to be thinking about... and this is what engineers are just really incredibly good at: give them a problem and don't tell them how to solve it, and they can solve it better. They have the freedom, they have the innovation, they have the creativity so that they can solve this problem. If it's only about productivity, then it's just lines of code or number of PRs or whatever. But we really want to talk about value: how do we unlock value, and how do we get value faster? And that involves, yes, making them more productive and removing friction, because then they have the flow and the reduced cognitive load and the things that we talked about.

Lenny RachitskyAwesome, okay. And then say someone wants to start this team, what does it usually look like. At Airbnb, I remember this team forming. It was just like an engineer or two, getting it started and taking charge. What do you recommend as the pilot team, and then what does it look like as it grows?

Nicole ForsgrenThere are a few ways to do this, right? So if you're doing it yourself, you could do it with a couple of engineers, maybe a PM or a PgM or a TPM to help communicate, because comms plans are just so important here. On a small scale, what we want to do is look for those quick wins, look for things that you can do at small scale. Some folks call them things like paper cuts. There are small things that you can do to help people see the value and feel the benefit themselves: "How can a developer's work get better? How can their day-to-day work get better?" And you build momentum from there. If you're working from a top-down structure and you have the remit, you still want some quick wins, but those quick wins can look a little more global in scale, because you have the infrastructure or the backing to make different types of changes that aren't only local.

So an example of a small local change could be just cleaning up your tests, your test suites. Any team could do that, any team could do that. At more global scale, it might be changing organization-wide process that is just overly cumbersome or throwing some resourcing into cleaning up the provisioning environment.

Lenny RachitskyOkay. What kind of impact have you seen from teams like this forming, on the engineering teams at their companies?

Nicole ForsgrenI'll say I've seen a huge impact: for smaller companies, hundreds of thousands of dollars; for large companies, even in the billions. We also need to learn how to communicate that: "What does the math look like?" Many times, we can look at saving time, we can look at saving costs, we can look at a lot of different things. We can look at speed to value or speed to market. We can look at risk reduction. But the gains really are there. I will mention that it tends to follow something like the J-curve. So you'll have a couple of quick wins and it'll look like a big win, and then you'll hit a little divot where suddenly the really obvious projects, the low-hanging fruit, are handled. So now we need to do a little bit of work. We might need to build out a little more infrastructure. We might need to build out a little more telemetry, so that we can capture the things we want to capture. And once we get that done, then we start to see those benefits really compound.

Lenny RachitskySo going back to that measurement piece, what do you recommend? How do people find these numbers? Because so much of the power of this is being able to say, "We saved a million dollars doing this." What do you look at to figure that out?

Nicole ForsgrenI think there are a few different things to keep in mind, like who our key audience is, and we usually have a few key audiences. We really want to be able to speak to developers, because they're the ones who are going to be using the systems. They'll be partnering with you on either building them or at least providing feedback about what you're doing. So for them, we often want to frame this in terms of things they care about. First, time savings: if something gets faster, they can save time. They don't spend time doing setup when they don't need to anymore. Related to that is reduced toil. Compliance and security are super important, but many times they require several manual steps that... I won't say they're not value-add, but they're not value-add from an individual human perspective, so if we can automate as much as possible, that's great. And improved focus time.

That's from the developer's point of view. Leadership often cares about... They care about those things, but they often care more about other things. So we usually talk about costs in dollars: "Can we accelerate revenue? What does our time to value look like? What is our velocity? How quickly can we get feedback from customers?" For folks and organizations in really competitive environments, that can be really compelling, because it's all about speed. We can talk about saving money, and here we can look at quantifying savings. One example is test and build. If we clean up a test and build suite, a developer really wants to hear about time saved and more reliable systems. There's less toil because they don't have to keep re-running tests or go clean up test suites.
From the business perspective, cleaning up a test and build suite can mean cloud cost savings, because all of those tests are running somewhere on a cloud, and if they always fail, that's just a waste of spend; recovering some of that capacity can be useful. We can always talk about time and productivity gains: "How much equivalent developer time are we losing on things that are not value-add?" And sometimes we can correlate to business outcomes, and correlation is usually the best we can do here, but there can be some pretty compelling correlations in terms of speeding up time to value and increased market share, for example.
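The two-audience framing above reduces to simple arithmetic. Here is a minimal sketch of that math; the function names, the $200,000 fully loaded cost, the per-run CI price, and all the example figures are illustrative assumptions, not numbers from the conversation:

```python
def annual_time_savings_usd(num_devs, minutes_saved_per_dev_per_day,
                            fully_loaded_cost_per_dev=200_000,
                            minutes_per_day=480):
    """Price recovered developer time at fully loaded annual cost."""
    fraction_of_day = minutes_saved_per_dev_per_day / minutes_per_day
    return num_devs * fraction_of_day * fully_loaded_cost_per_dev

def annual_ci_savings_usd(wasted_runs_per_day, cost_per_run, working_days=230):
    """Cloud spend recovered by eliminating wasted (flaky or failing) CI runs."""
    return wasted_runs_per_day * cost_per_run * working_days

# Example: 200 devs each save 20 minutes a day after a test-suite cleanup,
# and 150 wasted CI runs a day at $0.40 each go away.
print(round(annual_time_savings_usd(200, 20)))  # equivalent dev time, in dollars/year
print(round(annual_ci_savings_usd(150, 0.40)))  # cloud spend, in dollars/year
```

The point is not the specific numbers but that each audience gets an answer in the unit it cares about: developers hear minutes per day recovered, leadership hears dollars per year.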

Lenny RachitskyLet me follow that thread and come back to what I think is the biggest question people have right now with AI and productivity, and I don't think anyone has the answer yet, but I'm curious to get your take: what should people do today? What's the best approach to understanding what impact AI tools are having on their productivity? Because they're spending all this money on these tools. I don't know, what are we getting out of this? So I guess things are moving faster, but I don't know. If someone had to say, "Okay, here's what I should probably try to do," what would be your best advice for measuring the impact of AI tools on productivity?

Nicole ForsgrenI would say it depends. In part, it depends on what your leadership chain really cares about. We're usually pretty good at figuring out what matters to developers, and we can communicate that to them. But if we're trying to identify just two or three data points to really focus on, because when we're first starting with data it can be challenging, what do they care about? Think about the messaging you've been hearing. Have they been talking about market share, losing market share, or competitiveness in the marketplace? If that's it, focus on speed. Think about ways you can capture metrics for speed, from feature to production, or feature to customer, or feature to experiment, and what that feedback loop looks like.

If they're talking about profit margin all the time... now, we always talk about money because this is business, but if that seems to be the overarching narrative, look for ways that you can save money and then translate that into recovered and recouped headcount cost. Or sometimes you'll rework a process and then no longer need as many vendors, so reductions in vendor spend can also help there. I also say it depends because sometimes leadership will say something and it comes up as a theme. If you can solve a problem that they have or that they're focused on, or even slightly reframe what you're doing... if they're calling everything developer productivity, go ahead and call it productivity. If they're calling it velocity, and velocity is what matters to them, think about how to frame this in terms of velocity. If they're talking about transformation or disruption, how does this help with the disruption? Because then it will resonate with them. We don't want to make them work to understand what it is that we're doing and the value that we provide.

Lenny RachitskyThat is such good advice. Just to reflect back: if your company is trying to figure out what sort of impact AI tools are having, first ask what the company cares about most, what leaders care about most. It could be market share, profit margin, velocity, or transformation. So your advice is to figure that out based on the words and phrases you're hearing, then figure out ways to measure that: market share growing, profit margin increasing. I love these examples, like time from feature idea to production or to experiment, so maybe start tracking that. If it's margin, it's money saved by fewer failing tests or some vendor you don't have to pay for, things like that. And then velocity, I imagine that's where things like DORA come in, just speed of engineering and shipping, or... What would you think about there for velocity?

Nicole ForsgrenI would say it's actually one of those... I would pick as broad a swath as you can. So if you can go from idea to customer or idea to experiment, how long does that take? How long did it typically take, and how long does it take now with improved use of AI tooling and reduction in friction? That's where, and we talk about this a little bit in the book, we deal with attribution challenges: what was responsible for this? Was it the DevEx work or was it AI? Go ahead and disclose that. Say, "Yes, we rolled out AI tools. We also had this effort in DevEx. They partnered very closely together." Both of them probably contributed, right? If we had AI tools without the DevEx improvements, we probably would've had some improvement, but not nearly as much.

Lenny RachitskyIf people were starting to do this today, say they're just like, "I want to start measuring developer experience," are there two or three metrics everybody basically needs, that they should just start measuring ASAP?

Nicole ForsgrenIf you're just starting today and have nothing at all, talk to people, obviously. After that, I would do surveys, because surveys can give you a nice overall view of the landscape quickly, so that you know where the big challenges are. I say that because if you're just starting, you might not have instrumentation through your system or all the metrics. And if you do already, it might not be what you think you want. Metrics that were designed without a purpose are questionable. Metrics that were designed for another purpose might work for what you want, but they might not, so we can't just assume we have them. That's one reason I like surveys, and we include an example in the book. You can just ask a few questions: "How satisfied are you? What are the biggest barriers to your productivity, or what are the biggest challenges to getting work done?" and let them pick from a set of tools or maybe a set of processes. Let them pick three, just three.

Of those three, how often does it affect you? Is it hourly? Is it daily? Is it weekly? Is it quarterly? Because sometimes it hits you every single day and you're just mad about it; sometimes it only hits you once a quarter because it's end of quarter, but it's so onerous. And then kind of an open-text field, like, "Is there anything else we should know?" That can give you incredible signal, because you're making folks prioritize the top three things. If you let them pick everything, it makes the data super, super messy. But with three things and how often, you can come up with a score, or a weighted score if you want, and then go dig into where that data should be and what data you need. But also, then you've got at least some kind of baseline. It'll be a subjective baseline, but now you'll know what the biggest challenges are.
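The "pick three, plus frequency" survey she describes turns into a weighted score with very little machinery. A sketch of one way to do it; the frequency weights, barrier names, and responses here are all illustrative assumptions:

```python
from collections import defaultdict

# Assumed weights: the more often a barrier bites, the heavier it counts.
FREQ_WEIGHT = {"hourly": 4, "daily": 3, "weekly": 2, "quarterly": 1}

# Each respondent names up to three barriers and how often each one hits.
responses = [
    [("flaky tests", "daily"), ("slow builds", "hourly"), ("env setup", "quarterly")],
    [("slow builds", "daily"), ("flaky tests", "weekly")],
    [("env setup", "weekly"), ("slow builds", "hourly"), ("code review wait", "daily")],
]

scores = defaultdict(int)
for picks in responses:
    for barrier, freq in picks[:3]:  # enforce the "pick three, just three" cap
        scores[barrier] += FREQ_WEIGHT[freq]

# Rank barriers by weighted score, biggest friction first.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
print(ranked)
```

With this data, "slow builds" comes out on top because it hits multiple people hourly, which is exactly the prioritization signal the capped pick-three format is designed to produce.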

Lenny RachitskyI love how all this comes back to just starting by talking to people and asking them these things, which is very similar to product management and building great products: have you talked to your customers? Everyone thinks they're doing this, but most people are not doing this enough.

Nicole ForsgrenAnd I will say one thing that's challenging when you think about getting data: interviews are data, and that's important; surveys are a little more quantified because we can turn them into counts, but that's where we also want to be careful. A lot of folks will go to write a survey question and say something like, "Were the build and test systems slow or complicated in the last week?" You're asking four different questions there. If someone answers yes, was it the build? Was it the test? Was it slow, or was it flaky or complicated? It can be really difficult to untangle what signal you're actually getting, so it's worth the time chatting with someone who's familiar with survey design, or having a conversation with Claude or Gemini or ChatGPT: "Here are the survey questions," or "Can you propose some?" And then make sure you take a couple of rounds: Is this a good survey question? What questions can I answer from the data that I get? What problems could I solve? If you can't answer a question with the data, don't collect it.

Lenny RachitskyAnd you have example surveys in your book for folks that want to just copy and paste and not have to think about this much.

Nicole ForsgrenYeah, example surveys and a lot of example questions. We even recommend what the format and flow should look like, how long it should be, and how long it should not be.

Lenny RachitskyOne thing I was reading is that you don't love happiness surveys specifically, asking engineers how happy they are. Is that true? If so, why is that?

Nicole ForsgrenI don't, no. Well, I'll say I don't love a happiness survey because there are too many things that contribute to happiness. Happiness is a lot, right? Happiness is work, happiness is family, happiness is hobbies, happiness is weekends... There are so many things that contribute to happiness. Now, that doesn't mean I don't care about happiness; I just think happiness surveys are not particularly useful here. What can be helpful is satisfaction, and people say, "That's the same thing." It's not, because you can ask, "Are you satisfied with this tool?" and then ask some follow-up questions. Now, those two are related, because the more satisfied you are with your job and your tools and the work and your team, the more it contributes to happiness. I used to joke... Remember the old cheese commercials, like "Happy cows make happy cheese"?

Lenny RachitskyNo.

Nicole ForsgrenIt was a California one. That was the best. Happy devs make happy code. They write better programs, they do better work, they're better team members and collaborators. But capturing and trying to directly influence happiness, that's not what we're here for. It's too challenging, it's too all-encompassing. Satisfaction can give us some signal.

Lenny RachitskyIn a totally different direction, in terms of tools you see people using, are there any where you'd say, "Oh yeah, this one's really commonly great," a tool people are finding a lot of success with? There are the common ones, Copilot, Cursor. I don't know. Is there anything that stands out that you want to share, just like, "Hey, you should check this tool out. People seem to love it"?

Nicole ForsgrenI think they're huge, right? Copilot, Cursor, Gemini.

Lenny RachitskyClaude Code.

Nicole ForsgrenYep, Claude Code. I love Claude Code.

Lenny RachitskyI have a whole post coming on ways to use Claude Code for non-engineering use cases.

Nicole ForsgrenCool. Nice.

Lenny RachitskyIt's so interesting. For example, you can tell Claude Code, "Find ways to clean up storage on my laptop," and it just tells you there are a bunch of files. It's like ChatGPT running on your computer, and it can do all kinds of crazy stuff on your computer for you, like a mini god.

Nicole ForsgrenI'm going to do that now. This is great.

Lenny RachitskyIt's so good. Yeah, that's why I'm writing this. Dan Shipper was on the podcast, and he said Claude Code is the most underrated AI tool out there because people don't realize what it's capable of. It's not just for coding, and that's what I'm trying to explore more and more. Okay. Is there anything else you think would be valuable to help people improve their developer experience and adapt to this new world of AI and engineering that we haven't covered?

Nicole ForsgrenI think something that's important to think about in general is to bring a product mindset to any type of DevEx improvements that are happening, and also the metrics that we collect and capture. By that, I mean we want to identify a problem, make sure we're solving a problem for a set of users. We want to think about creating MVPs and experiments and get fast feedback, do some rapid iteration. We want to have a strategy. We want to know who our addressable market is. We want to know what success is. We want to basically have a go-to-market function. We need to have comms. We need to get continuous feedback from our customers. We want to keep improving. And, at some point, we want to think about sunsetting something. Is it in maintenance mode? Is it sunsetting?

And I think that's important in general, but I think it's extra important now because when we have AI tools, we're using AI tools, we're embedding AI into our products, things are changing so rapidly that it can be really important to take half a beat and say, "Okay, what's the problem I'm trying to solve right here? Is this metric that we've had for the last 10 years still important or should this be sunset because it's not really important anymore? It's not driving the types of decisions and actions that I need."

Lenny RachitskyBefore we get to our exciting lightning round, I want to take us to AI Corner, which is a recurring segment on this podcast. Is there some way that you've found a use for an AI tool in your life, in your work that you think might be fun to share, that you think might be useful to other people?

Nicole ForsgrenI have been working on some home design and redecorating rooms and stuff. I'm working with a designer because I know what I like, but I don't know how to get there, I'm not good at this. But I've really been loving ChatGPT and Gemini especially to render pictures for me, so I can give it the floor plan, I can give it one shot of the room that's definitely not what it's supposed to look like, and then I can give it pictures of a couple different things, and then I can just tell it change the walls or change the furniture layout or change something. It helps me and it's relatively quick. It helps me kind of visualize the things... Again, I know what I like, but I don't know how to get there, so I know if I like it or not, which is probably a very random use, but it's fun for now.

Lenny RachitskyMy wife does exactly the same thing. She's sending me constantly, "Here's what this rug will look like in our living room. Here's this water feature." It's so good and it keeps getting better. It's just like, "Wow, that's exactly our house with this new rug," and all you do is just upload these two photos and just like, "Cool. How would this look in our room?"

Nicole ForsgrenYeah, I've been impressed a couple times. Definitely the machines are listening to us. It's given me a mock-up of a room or something and then it throws in a dog bed, because I have dogs. I'm like, "I did not tell you to do that, but yeah, that's probably the color and style of dog bed that I should have in this room."

Lenny RachitskySpeaking of that, have you tried this use case, ask ChatGPT, "Generate an image of what you think my house looks like based on everything you know about me."

Nicole ForsgrenI haven't.

Lenny RachitskyBecause it has memory and it remembers everything you've talked about, and it's hilarious. You got to do it.

Nicole ForsgrenOkay, that's on my to-do list.

Lenny RachitskyThere we go. Bonus use case. Nicole, with that, we've reached our very exciting lightning round. I've got five questions for you. Are you ready?

Nicole ForsgrenAwesome. Let's go.

Lenny RachitskyWhat are two or three books that you find yourself recommending most to other people?

Nicole ForsgrenOutlive by Peter Attia is fantastic. Another one that's maybe related, I guess: I hurt my back, so it's not great, but Back Mechanic by Stuart McGill is incredible. Shout out to anyone who has a hurt lower back. It's for a layperson to read through and figure out how to fix lower back problems. This one's kind of random, but I love How Big Things Get Done. I can't pronounce the authors' names; I think one of them is Scandinavian. It dissects really large projects through recent-ish history, where they failed and why. And I think it's really interesting for us to think about, especially now in this AI moment when basically all of our software systems, at least, are going to be changing. So how do we think about approaching what is essentially going to be a very large project? And then, sorry, I'm going to throw in a bonus one, The Undoing Project by Michael Lewis. Matt Velloso recommended it to me, and it's so good.

Lenny RachitskyYes, I read that-

Nicole ForsgrenI audibly gasped at the last sentence.

Lenny RachitskyOh. I was like, "What?"

Nicole ForsgrenI was [inaudible 01:03:48]. Yeah, I was not expecting it.

Lenny RachitskyI read that and I do not remember that last sentence. Oh, man. Okay, cool. Next question. Do you have a favorite movie or TV show you recently watched and enjoyed?

Nicole ForsgrenI'll say I watch Love Is Blind. If I got to shut down at the end of the day, Love Is Blind is fun.

Lenny RachitskyThere's a new season out.

Nicole ForsgrenYeah, very excited... and Shrinking. Have you seen Shrinking?

Lenny RachitskyNo. I think I started The Therapist and yeah, I gave it a shot.

Nicole ForsgrenStrongly recommend it. It's cute.

Lenny RachitskySweet. Is there a product you've recently discovered that you really love? Could be an app, could be some kitchen gadgets, some clothing.

Nicole ForsgrenYeah, the Ninja Creami is-

Lenny RachitskyDid you say this last time?

Nicole ForsgrenI don't know. I may have. I don't think so.

Lenny RachitskySomebody said this and I still remember it. It's like-

Nicole ForsgrenIt's so good.

Lenny Rachitsky... you make ice cream and stuff with it, right?

Nicole ForsgrenYeah, and you can basically freeze a protein shake and then it turns it into ice cream-

Lenny RachitskyOh, man.

Nicole Forsgren... which is delicious. Another one is a Jura coffee maker. I love good coffee and I'm not great at making it, so I can just push a button and it'll give me anything I want, including lattes, cappuccinos, anything. So that's kind of fun.

Lenny RachitskySweet, okay. Do you have a favorite-

Nicole ForsgrenJust sugar and caffeine. I just need to power through the day.

Lenny RachitskyThere's the engineering productivity 101.

Nicole ForsgrenYes.

Lenny RachitskyOh, man. Okay, two more questions. Do you have a favorite life motto that you often find useful in work or life and come back to in various ways?

Nicole ForsgrenYeah, I think one that's come up a couple times, and it's not a verbatim thing, it's more the vibe: hindsight is 20/20, but it's also really dumb. I think if we made the best decision we could at the time with the information that we had available, then it is what it is. If you make a bad decision because you knew better, you had the information, that's not great. I don't think we give ourselves or other people enough grace, because we always end up finding more information out later.

Lenny RachitskyHear, hear. Final question. I was going to ask you something else, but as we are preparing for this, you shared that you have a new role at Google. Maybe just talk about that, what you're up to there, why you joined Google, anything folks should know.

Nicole ForsgrenSure. I am senior director of developer intelligence and core developer. It's super exciting and super fun because of all of these things we've been talking about. It's focused on Google and all their properties and their underlying infrastructure: how can we improve developer experience, developer productivity, velocity, all of these things we've been talking about, and, because I'm kind of the numbers person, how do we want to think about measuring it, how does measurement change, how do feedback loops change, how can we improve the experience throughout, and then drive that change through an organization in ways that are meaningful and impactful and faster than they've been before.

Lenny RachitskyNice job, Google, getting Nicole. What a win. I need to get some more Google stock ASAP. Okay, two follow-up questions. Where can folks find you online and find your book online if they want to dig deeper? And how can listeners be useful to you?

Nicole ForsgrenOnline, you can find the book at developerexperiencebook.com, I'm at nicolefv.com, and LinkedIn occasionally. Sometimes it's a mess; I try to wade through all of the noise, and I get there. To be useful, sign up for the book and the workbooks. The workbooks are free. I'd love to get any kind of feedback on what works and what doesn't. I always love hearing those kinds of stories.

Lenny RachitskyNicole, thank you so much for being here.

Nicole ForsgrenThanks for having me, Lenny.

Lenny RachitskyMy pleasure. Thanks, again. Bye, everyone.

Thank you so much for listening. If you found this valuable, you can subscribe to the show on Apple Podcasts, Spotify, or your favorite podcast app. Also, please consider giving us a rating or leaving a review as that really helps other listeners find the podcast. You can find all past episodes or learn more about the show at lennyspodcast.com. See you in the next episode.

Chapter 02 / 08

Section 02

Chinese translation complete

Lenny RachitskyA lot of companies are trying to measure productivity for their teams.

Nicole ForsgrenMost productivity metrics are a lie. If the goal is more lines of code, I can prompt something to write the longest piece of code ever. It's just too easy to game that system.

Lenny RachitskyHow do I know if my eng team is moving fast enough, if they can move faster, if they're just not performing as well as they can?

Nicole ForsgrenMost teams can move faster. But faster for what? We can ship trash faster every single day. We need strategy and really smart decisions to know what to ship.

Lenny RachitskyOne of the biggest issues we're going to probably have with AI is learning how much to trust the code that it generates.

Nicole ForsgrenWe can't just put in a command, get something back, and accept it. We really need to evaluate it. Are we seeing hallucinations? What's the reliability? Does it meet the style that we would typically write?

Lenny RachitskySo much of the time is now going to be spent reviewing code versus writing code.

Nicole ForsgrenThere's some real opportunity there to not just rethink workflows, but rethink how we structure our days and how we structure our work. Now we can also make a 45-minute work block useful, because getting into flow is, at least in part, handed off to the machine, or the machine can help us get back into flow by reminding us of context and generating diagrams of the system.

Lenny RachitskyWhat's just one thing that you think an eng team, a product team can do this week or next week to get more done?

Nicole ForsgrenHonestly, I think the best thing you can do is go talk to people and really listen.

Lenny RachitskyToday's guest is Nicole Forsgren. As AI reshapes developer productivity, everyone increasingly wants to know: how do we measure those gains? Are these tools actually helping developers, or holding them back? Nicole has been in this field earlier and deeper than almost anyone. She created DORA and SPACE, the most widely used frameworks for measuring developer experience, wrote Accelerate, the most important book in the space, and her new book, Frictionless, is out soon. It gives you a playbook for helping teams move faster and more effectively in this new age of AI. Her core point is that AI really does speed up coding, but developers aren't speeding up as much as you'd think, because build failures, unreliable tools and processes, and a range of new bottlenecks still get in the way.

This episode is brought to you by Mercury. I've been using Mercury for years, and honestly, once I switched, it was hard to go back. Moving from Chase to Mercury made a real difference: sending wires, tracking spend, and giving teammates permissioned access to money are all effortless. Compared to the clunky, hard-to-use websites and apps of many traditional banks, Mercury's design is remarkably clean. It brings every way you use money into one product, including credit cards, invoicing, bill pay, team expenses, and treasury. Whether you're a startup paying contractors and earning yield on idle cash, an agency invoicing and managing receivables, or an e-commerce brand watching cash flow and accessing capital, Mercury can be configured to how your business actually works.

To see why more than 200,000 entrepreneurs use Mercury, go to mercury.com; you can apply online in 10 minutes. Mercury is not a bank; banking services are provided by FDIC-insured partner banks, with details in the show notes. Here's a quick question: what do OpenAI, Cursor, Perplexity, Vercel, FLAN, and hundreds of other winning companies have in common? They all use today's sponsor, WorkOS. If you build enterprise software, you've probably felt the pain of integrating single sign-on, SCIM, RBAC, audit logs, and the other features big customers require. WorkOS turns those deal-blockers into APIs you can just plug in, with a modern developer platform built for B2B SaaS.

Whether you're an early-stage startup landing your first enterprise customer or a unicorn expanding globally, WorkOS is the fastest path to becoming enterprise-ready and unlocking growth. Think of it as the Stripe of enterprise features. Get started at workos.com, or reach out through their Slack support, where actual engineers reply remarkably fast. WorkOS lets you build like the best teams, with great APIs, complete docs, and a smooth developer experience. Head to workos.com today and make your app enterprise-ready.

Nicole, thank you so much for being here, and welcome to the podcast.

Nicole ForsgrenThank you. I'm so happy to be here.

Lenny RachitskyIt's great to have you back. I was just rewatching our first episode from two and a half years ago, and I was struck, though maybe not that surprised, that we barely talked about AI. That episode was called "How to measure and improve developer productivity," and we didn't get to AI until almost an hour in, and even then it was, "Hmm, what's going to happen with AI and productivity?" Isn't that wild?

Nicole ForsgrenIt is. It had only just hit the mainstream, and everyone was talking about it like crazy, but at the same time a lot hadn't changed. A lot of things still mattered, and a lot of things were still the same. Also, the fact that it's been two and a half years is a little wild. Where does the time go? Time is a social construct anyway.

Lenny RachitskyRight. Most of our conversation back then was questions like, "How is this going to affect people? How will it change how we build products?" It was just a glimmer then. Now, AI might be the thing people most want to talk about when it comes to engineering productivity, and we're going to spend a lot of time on it today. The reason I'm especially excited for this conversation is that so much money is being poured into AI tools that promise productivity gains, and the fastest-growing companies in the world right now are exactly these engineering AI tools. More and more people are asking: what are we actually getting out of this? How much is it really helping? How do we get more productive?

You've been at the center of this world longer than anyone, and you invented many of the frameworks people still use, so I'm excited to have you back to talk about all of this. I want to start with the term DevEx, which comes up constantly in this space and will come up a lot today. Can you start by explaining: what exactly is DevEx?

Nicole ForsgrenDevEx is developer experience. When we talk about developer experience, we're really talking about what it feels like for a developer to build software day to day: the friction they hit, the processes they have to go through, the support they get. It matters because when DevEx is bad, almost nothing else can save you. The best processes, the best tools, all the magic in the world: if DevEx is broken, so much gets stuck.

Lenny RachitskyWithin DevEx there's productivity, and I think the key insight from you and many in this field is that it's not just productivity, it's also engineers' happiness. We'll get into a lot of that, but first I'd love for you to talk about... not just productivity, but the broader components of engineers succeeding at a company.

Nicole ForsgrenYes, and I love that point, because productivity itself is hard to define. If you only look at output, there are lots of different ways to get the result. But if those ways are full of effort and friction, developers eventually burn out. Or the cognitive load is so high that it's hard to even think about what you're doing, because your brain is consumed by all the mechanical, plumbing-level details, and there isn't enough headspace left for genuinely creative solutions and problems. So I love that self-reinforcing loop: you do more work, and you do better work. It's better for people, better for systems, better for customers.

Lenny RachitskyI was going to save this for later, but I want to talk about it now: flow state for engineers. I was an engineer early on, studied computer science, did it for 10 years. For me, the best part of the job was getting into that flow while writing code and building things; the whole process was so fun. In a lot of ways, AI is making that harder now, because you're working with all these agents, and a lot of the code feels like it was written for you. Talk about how important flow state is for developers, happiness, and developer productivity, and what impact you're seeing AI have on it.

Nicole ForsgrenThere are a lot of ways to talk about DevEx. One way is to break it into three key parts that each matter on their own and also reinforce each other. Flow state is one, cognitive load is another, and then feedback loops. I think your question about flow is a really good one. I'll also admit we're only a few years into this, and we're still figuring out what the optimal flow conditions and cognitive demands look like for people in this new environment. Because you're right: a lot of the time we're constantly interrupted. You don't dive into flow, lock in, and just write a huge amount of code and type and type the way you used to. Now you're more often creating a prompt, getting some code back, reviewing that code, and trying to integrate what's happening in the system, and it's really easy to get interrupted in the middle of that.

Chapter 03 / 08

Section 03

Chinese translation complete

Nicole ForsgrenOn the other hand, it can also help you get into flow. I've seen senior engineers build really impressive toolchains, and they've figured out how to keep the rhythm going. Fast feedback loops work really well for them. They farm out different pieces to different agents. That way they stay at the level of "what's the goal, which pieces need to get done, what's the fastest way there, then step back, evaluate the whole, then go back and fix a few things," instead of staring at the weeds and writing code line by line.

Lenny RachitskyCan you say more about that engineer workflow? What does it actually look like?

Nicole ForsgrenI've talked to some of these folks and watched how they actually work. I haven't built one myself yet; it's on my to-do list. They can build a really impressive workspace and workflow. Right now most of us just play with the tools: type a prompt, get a few lines back, or get a whole program back. These folks start by feeding the system the goal: here's what I'm building, here are the infrastructure components it needs, here's the tech stack, here's the rough process to follow, help me think this through first. The system helps them design the solution up front. Then each piece gets assigned to an agent to work on in parallel, and they tell the agents from the start: these pieces have to work together in the end, the architecture has to be right, and you have to use the right APIs and conventions.

Then they let it run for a few minutes. They go think about something else that's more interesting, or that they can already tell will be tricky, and when they come back, what they get is usually a lot better than pure vibe coding. Because the front of the process was so systematic, the result is much closer to production code.

Lenny RachitskySo what I'm hearing is: spend a little time planning up front and pave the road the way these AI-forward engineers do, instead of charging ahead and figuring it out as you go.

Nicole ForsgrenRight.

Lenny RachitskyOkay, let me ask a question that's central for a lot of people. Lots of companies are measuring team productivity: will this make us more productive? Will it slow us down? So first: what do you think people most often get wrong when measuring productivity gains from AI today?

Nicole ForsgrenLet me start by saying most productivity metrics are a lie. It's tricky, because historically... lines of code was never a good metric, but lots of people still use it as a proxy for code volume, output, or complexity. For many systems now, whatever you could sort of halfway get away with using lines of code for is completely broken, because what does "lines of code" even mean? If the goal is more lines of code, I can prompt a system to write the longest piece of code ever, plus a pile of comments. We all know agents and LLMs are naturally verbose, so it's incredibly easy to game that metric and to stuff complexity and tech debt into all of the work. That said, it's not entirely useless; there are still things worth watching. Lines of code was never a great productivity metric, arguably a bad one, but if we can distinguish which code came from humans and which came from AI, it becomes more relevant, because then we can answer downstream questions.

For example, what's the survival rate of the code? What's our code quality? Is our code getting fed back into training systems? If this code will later train other systems, especially when we're fine-tuning or doing local tuning, how much of it is machine-generated? What kind of loop does that create? Are we inadvertently introducing certain patterns or biases? So on one hand it's not a great productivity metric; on the other hand it really is useful. I'd even say the same about DORA. I built the DORA metrics, the speed metrics and stability metrics. If you only look at those, it's not enough anymore, because AI has changed how we think about feedback loops. Feedback has to be faster now. DORA was designed to assess the speed and stability of the whole pipeline, and that still holds. But we can't blindly apply the old metrics the way we used to, because we'd miss some really important phenomena, and we'd miss how people's ways of working are changing.

Lenny RachitskyInteresting. You invented DORA, which for a long time was basically the main framework people used to measure productivity. Then came SPACE, Core 4, maybe others. So what I'm hearing is that these frameworks are all a bit outdated in the AI era, because AI now contributes such a large share of the code.

Nicole ForsgrenWhat I'd say is: if it's a prescriptive metric, it has to be used only in the way it prescribes.

Lenny RachitskySo...

Nicole ForsgrenDORA has four key metrics. Two are speed metrics: deployment frequency and lead time, meaning the time from code commit to running in production. Two are stability metrics: MTTR and change failure rate. If those metrics are being used to assess pipeline speed and overall performance, great. If you want to use them to understand... there's an implicit feedback loop in there, right? You used to get feedback mostly from customers. But with AI you can't use them blindly anymore, because our feedback loops come much earlier, and not just at the local build-and-test stage. There are feedback loops throughout the pipeline, sometimes mid-pipeline, and we want to take advantage of all of them, in a way we didn't pay as much attention to before. I'm not saying it wasn't possible before; people just weren't focused on it.
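As an illustration of the four DORA keys Nicole lists, here's a minimal sketch of computing them from raw delivery records. The record shapes and field names are hypothetical; a real pipeline would pull these from CI/CD and incident tooling.

```python
from datetime import datetime, timedelta
from statistics import median

# Hypothetical delivery records; real data would come from CI/CD and incident tools.
deployments = [
    {"committed": datetime(2025, 1, 6, 9), "deployed": datetime(2025, 1, 6, 15), "failed": False},
    {"committed": datetime(2025, 1, 7, 10), "deployed": datetime(2025, 1, 8, 11), "failed": True},
    {"committed": datetime(2025, 1, 9, 14), "deployed": datetime(2025, 1, 9, 16), "failed": False},
]
incidents = [
    {"opened": datetime(2025, 1, 8, 11), "resolved": datetime(2025, 1, 8, 13)},
]

def dora_four_keys(deployments, incidents, window_days=7):
    """Compute the two speed and two stability metrics over a time window."""
    deploy_frequency = len(deployments) / window_days          # deploys per day
    lead_times = [d["deployed"] - d["committed"] for d in deployments]
    median_lead_time = median(lead_times)                      # commit -> in production
    change_failure_rate = sum(d["failed"] for d in deployments) / len(deployments)
    restore_times = [i["resolved"] - i["opened"] for i in incidents]
    mttr = sum(restore_times, timedelta()) / len(restore_times)
    return {
        "deploys_per_day": deploy_frequency,
        "median_lead_time": median_lead_time,
        "change_failure_rate": change_failure_rate,
        "mttr": mttr,
    }

metrics = dora_four_keys(deployments, incidents)
print(metrics)
```

As Nicole stresses, these four numbers only describe the pipeline; they say nothing about the earlier, in-pipeline feedback loops that AI-assisted work depends on.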

Nicole ForsgrenSo those are prescriptive metrics. SPACE, on the other hand, is a framework; it doesn't tell you exactly which metric to use. A lot of people get frustrated by that, because I won't tell them exactly what to measure. But I've come to think that's exactly its strength. We're already seeing that SPACE holds up quite well in these emerging AI scenarios, because we still need to look at satisfaction, performance, activity, communication and collaboration, and efficiency and flow. It's an acronym. We still look at satisfaction. We still look at performance, meaning what the outcome is. We still look at activity. Yes, in some contexts lines of code or PR counts can tell you something, or alert counts, or counts of some type of action; those are all activity or counts. Communication and collaboration matter enormously too, because it's both how systems talk to each other and how people talk to each other. For example, how much work is getting handed to a chatbot instead of asking the senior engineer on your team? More isn't necessarily better, and less isn't necessarily better; it depends.

Then there's efficiency and flow: can people get into flow? How long does it take to get something done? How does work move through our systems? And here I'd add a few dimensions. When I talk to some of the early authors, I especially call out trust. Not that trust didn't matter before, but now it's become a very, very up-front question. Right? Before, if your code compiled, you were usually fine; that's just how it was. But LLMs are non-deterministic. Now we can't just put in a command, get something back, and accept it. We really have to evaluate it: are there hallucinations? What's the reliability? Is it the style we would typically write? And if not, is that acceptable anyway? So it depends... again, prescriptive. You have to make sure it's being used for the right purpose, right?

Lenny RachitskyWe'll get to what you think best practice looks like now. You have a book coming out about how to do this well, and we'll talk about it. Last time we talked, I remember you emphasized that one of the biggest problems AI might create would be trust: understanding, and learning, how much to trust the code it generates. You said two and a half years ago that a lot of time would shift to reviewing code instead of writing code. That's what I'm hearing here.

Nicole ForsgrenI think what's worth watching next is how this changes the way we organize work going forward. We just talked about flow and cognitive load. Now our attention has to switch to specific things at specific moments, and work gets chopped up; that's different from how work used to be. I think there's a big opportunity here, not just to rethink workflows, but to rethink how we structure our days and how we structure our work.

Lenny RachitskyCan you say more? What specifically are you thinking? What do you think will happen, where is this going, and what have you seen work?

Nicole ForsgrenThis part is pure speculation. Gloria Mark, for example, has done great research on attention and deep work: humans can produce roughly four hours of high-quality deep work a day. That's about it.

Lenny RachitskyYes, I feel that deeply.

Nicole ForsgrenThat's roughly the cap. I know someone will say, "I'm superhuman, I can..."

Lenny RachitskyWhat if you take 20 grams of creatine?

Nicole ForsgrenRight, what if we microdose?

Lenny RachitskyExactly.

Nicole ForsgrenYeah. So given that we know we only have about four hours of high-quality deep work in us... most people have felt this, right? There are windows where you're at your best, maybe morning, maybe afternoon. And then you hit a state of: I should go clear my inbox, because that's about all I'm capable of right now. I can function, but I'm not going to do my best innovation, problem-solving, writing, or coding. A lot of the time, getting there takes a long stretch to get into flow and get that deep work. For me it's usually about two hours. One hour is a bit awkward, because getting into the state itself takes time. Okay, so looking back at the "old days" three, three and a half years ago: we could block out four hours and probably get two or three hours of genuinely high-quality work, because we were focused, with few or no interruptions.

Now, writing code and building systems is itself an interrupt-driven activity, or at least full of interruptions, because just as you start something, it cuts in. So how do we think about that? Does it mean four-hour blocks are still useful? Probably. But does it also mean we can now make a 45-minute block useful? Because getting into flow has, at least in part, been handed off to the machine, or the machine can help us get back into flow by reminding us of context, generating system diagrams, that kind of thing. I think it's a fascinating direction, with lots of questions and lots of opportunity. Please, someone go do this research and report back, because... it may not make it onto my own list, but it's a great question.

Chapter 04 / 08

Section 04

Chinese translation complete

Lenny RachitskyThis is so interesting. Every engineer is basically becoming an EM, an engineering manager, coordinating these very junior AI engineers. So you're saying even with a 30-minute block you can dive deep into the code while the AI engineers you've sent off on tasks keep working. And these tools will also remind you: here's where you left off. Okay, you can jump right back into this code and make a few changes.

Nicole ForsgrenRight.

Lenny RachitskySo interesting. Let me zoom out a bit... before we get into your developer experience framework and your latest thinking: beyond "engineers get more done," why should companies really make developer experience a priority? What's your best pitch?

Nicole ForsgrenI don't love leading with ROI, but the business value here... the opportunity is enormous. Generally, we write software partly for fun and interest, but we also write software because it serves business needs. It wins us market share, it helps us attract and retain customers, it lets us do those things. I think DevEx matters because it makes all of that software creation possible, and all of that problem-solving possible. It also lets us run super-fast experiments with customers. Prototyping used to take a while, and running a real A/B test in production usually took even longer; now it can be done in hours.

Lenny RachitskyLet's get more tactical. Before the bigger framework: what's the single most useful thing an engineering team or product team could do this week or next to improve developer experience, and maybe get more done?

Nicole ForsgrenHonestly, I think the best thing you can do is go talk to people and really listen. I love that this podcast's audience is largely PMs, because they're usually genuinely good at this. I'd start with listening, not with tools and automation. A lot of companies jump straight to "okay, I'll build a tool" or "I'll stand something up." But what you end up building is usually the thing you personally struggled with, or whatever's easiest to automate. Instead, just go talk to people and ask developers: "Think about yesterday. What did you do yesterday? Walk me through it. What parts made you really happy? What was really hard? Where did you get annoyed? Where was it slow? Where was the friction?" Talk to a handful of people and you'll often surface problems that are small investments with big impact, or identify a process that doesn't need to be that complicated or that slow.

Lenny RachitskySo what I'm hearing is almost: if you want your team to be faster and happier, before you do anything else, go ask them what's actually bugging them.

Nicole ForsgrenYes, ask first. And trust me, most developers are more than happy to tell you what's broken and what's wrong. I remember working with a company that had this really painful process running on an old mainframe system. They assumed they'd have to rebuild the whole thing, so they never touched it or even discussed it. Everyone hated it, because the process dragged everything out. All they actually had to do was change one process. Often, all you really have to do is change a process. They changed it so they no longer had to... I believe someone had to print it out and carry the paperwork three or four floors up for approval, and then someone else had to carry it back down. That entire middle step just went away. They didn't rebuild anything, didn't re-architect anything; they just sent an email.

Lenny RachitskyI want to keep pulling on this. What you said makes me curious: what do people most commonly end up doing? If you're just starting out with "okay, we need to take engineering experience seriously," what are the two or three things companies most often need to fix?

Nicole ForsgrenI'd say the process thing is almost always fixable, and it usually doesn't take much engineering investment or many engineering heads. Especially at big companies, there's often some process stacked layer on layer, basically steps on steps on steps. It exists only because it's always existed, and it no longer makes sense. Small companies can be too YOLO the other way; you have no idea what's going on and you're just chasing people around. So if you can put a lightweight process in place, that usually helps. That's often one of the best starting points, especially when your reach into the rest of the organization is still limited. A lot of the time, starting with one process inside your own team works.

From a business leader's perspective, a lot of what you can do is provide structure and support for this kind of organizational change. Be clear about what you're doing, be clear about the priorities, be clear about why it matters, and celebrate the wins. Because if people are doing this as a one-off, completely isolated side project, it's really hard to build momentum, and really hard to get people to care and stay invested. It feels like just another internal project that won't matter and won't get celebrated. But the upside for the business is very high.

Lenny RachitskyInteresting. What I'm hearing from you involves almost no tools or technology. Not "migrate to this cloud," not "install this new deployment system," but process, people, organization, and morale.

Nicole ForsgrenRight, though the technology absolutely matters too, especially now with AI, as we rethink how build and test systems work. We're also rethinking how we get feedback to users, making "what to share and when" highly customized. There's plenty of technical work in there, but it's not the whole story. It's necessary but not sufficient, and it's not necessarily where you have to start.

Lenny RachitskyLet me ask you a hard question that occurred to me as you were talking. I think it's the question most founders and leaders are asking: how do I know if my engineering team is moving fast enough? Can they move faster, or are they already maxed out? Are there smells, signals, that tell you, "Yes, my team should be faster," or, "No, this is as fast as they can go"?

Nicole ForsgrenMost teams can move faster, right? At the same time, given what we know about cognitive load, not every speedup is a good thing. Or rather, past a certain point the returns start to diminish, and most people are nowhere near that point. Frankly, I almost never see a team that's there. So how do you know? If you keep hearing about build failures, flaky tests, overlong processes, if you have to file requests for new systems or spin up a new environment, or if switching tasks and switching projects is really, really hard, those are basically the signals. Also, if people have the chance to go work elsewhere in the org and don't, for vague reasons that aren't even political, and everyone complains the moment the systems come up, that usually means there's friction somewhere.

Because once you've finally figured out your systems and can actually get work done, the cost of switching to somewhere else is often absurdly high. So sometimes people just stay. I've worked with companies where moving between orgs internally cost almost as much as being a new hire, because the systems were so different, the friction so high, and so many things so hard to do.

Lenny RachitskyI especially like the first part of your answer: "you can always be faster." I think every founder is happy to hear that. But you're saying there are diminishing returns over time?

Nicole ForsgrenRight, and you don't know what the quality is, right? So I think that's the other side: you can always be faster, but faster for what? Are we making the right business decisions? This is exactly where PMs should step in. We can ship trash faster, every single day. What we need is strategy and smart enough decisions about what to ship, what to experiment with, what order to build features in, how to roll out. Strategy is the core, and then you think about speeding up. If the other steps aren't right, it's garbage in, garbage out.

Lenny RachitskyI want to keep going down this path, but first, let me play back what you said. You're saying if a team keeps hitting build failures, that's a signal; if tests are constantly flaky with lots of false positives, that's a signal; if it's hard to context-switch between projects, that's a signal; if people bring up the systems and say "this thing is genuinely painful to use," that's a signal. Is that about right?

Nicole ForsgrenYes.

Lenny RachitskyOkay. Back to the point you just made. There's this sense right now that AI is making teams much faster because it's writing so much of their code. You've got all these async agents, all these engineers working for you. But your core point seems to be that this is just one piece of engineering work; there's a lot else, like figuring out what to build in the first place and aligning internally. Can you expand on that: AI has huge opportunities to improve engineering efficiency, but there are many other steps that don't get better just because of AI?

Nicole ForsgrenThey can, or they might in the future, right?

Lenny RachitskyMm-hmm.

Nicole ForsgrenI think there are actually a lot of ways to bring AI tools in to help us sharpen strategy, sharpen messaging, think through experiment methods or goals, or think about how big our serviceable market really is. But that assumes the strategy and plan are already reasonably aligned, or at least that there are two or three candidate options to test. Because engineering implementation, especially prototyping, is so much faster now. We can just throw prototypes out there. As long as the infrastructure is in place, we can run all kinds of customer-facing tests and experiments, which lets us learn and move far faster than before. In some places, pushing something to production for an A/B test and getting feedback used to take months; now it's a day or two, certainly under a week. But we have to make sure we're testing the right thing, and confirm we're working with the right people and have the data we need.

AI can be a good partner here too, if you have a real conversation with it and then go ask your experts: what types of data should I look at? What instrumentation do I need? What analyses can I run? Then when you go to the data science team, you can show up with something more credible: "Here's what I plan to do, and I'd like to start with..." We don't want to YOLO A/B tests, because that's likely to... run some big test that confuses users or customers, or breaks privacy or security protocols, and end up with a pile of unusable data because you can't capture the signal you wanted. What I'm seeing now is people compressing this from weeks to days. And that means they can walk into those key stakeholder conversations far better informed and better prepared.

Chapter 05 / 08

Section 05

Chinese translation complete

Lenny RachitskyThis episode is brought to you by Coda. I use Coda almost every day to run the podcast and the community: guest questions, community resources, workflows, it all lives there. It's best for projects where you want clarity from the start: you know who owns what, and you know where to find the data you need. You shouldn't be wasting time hunting around, because project tracking, OKRs, docs, and tables can all live in one Coda tab. With Coda's collaborative all-in-one workspace, you get the flexibility of docs, the structure of spreadsheets, the power of apps, and the intelligence of AI, all in one organized tab.

Like I said, I use Coda nearly every day. More than 50,000 teams already use it to make collaboration more unified and more focused. If you're a startup team looking to improve alignment and agility, Coda can help you move quickly from planning to execution. To try it yourself, go to coda.io/lenny and get started today with 6 months free of the team plan for startups. That's coda.io/lenny.

I love that you work with so many different companies and kinds of businesses. Very few people get to see what's actually happening inside so many places. How big are the AI productivity gains you're seeing right now? How much of a boost do you think there really is?

Nicole ForsgrenI'd say the gains are real, but we don't have great ways to measure them yet. We're still figuring out what to measure and how. One of the best metrics is probably end-to-end velocity: how fast a feature, a product, or some piece of work flows through the whole system, so you can then run experiments, or go from idea all the way to outcome, or even just push a feature or a slice through the system so it's easy to test. That's valuable. But it's also hard to attribute directly to a particular AI tool in a particular developer's hands. That said, I'm seeing other things too, again that rapid prototyping.

I genuinely hate lines of code, but I'm going to use it anyway. We have seen... I've worked with people who, looking across a whole set of companies, found that AI generates noticeably more code for regular users. They also found that for people who regularly use AI coding environments, AI ADEs, the tool gives them more code to start with, and the engineers' own incremental output was as much as twice what the coding agent produced. So I'd call that a secondary signal, or a side effect, a smell: it helps you get unstuck and accelerates work you were already going to do. It happens to me all the time: the first few minutes are the hardest part of starting, but once I'm started, I can get going. So these tools are especially good at pulling you out of being stuck and opening up your work.

Lenny RachitskyOne thing I've seen a lot of people share on Twitter is that OpenAI Codex is really good at finding gnarly bugs. I remember Karpathy sharing this too. He'd been stuck on a bug for ages and no AI tool could find it. Then the latest Codex spent about an hour digging into it and finally found it.

Nicole ForsgrenYes, I've heard a lot of amazing stories like that. Also writing unit tests, running unit tests, writing docs, organizing docs. Because now a lot of people think, "Oh, we have agents anyway, I don't need to read the docs, the code's right there." But it turns out agents depend on good data too, because what they learn from and are grounded in determines how they perform. Better data gives better results, and that data includes documentation and comments. The better your docs and comments, the better your AI tools usually perform.

Lenny RachitskyAnd AI can help you write those docs. I've been playing with Devin a bit lately, and it's genuinely strong at that.

Nicole ForsgrenRight.

Lenny RachitskyOkay, let's talk about the framework, the book. You have a book coming out called Frictionless, which already sounds like a dream: "how do you build a dev team with almost no friction?" The book is Frictionless: 7 Steps to Remove Barriers, Unlock Value, and Outpace Your Competition in the Age of AI. There's a seven-step process in it. Tell us about the book, who it's for, what problem it solves, and then walk us through the seven steps.

Nicole ForsgrenI should also say the book is co-written with Abi Noda, who... he's the DX guy. His experience in this space is enormous, he's worked with hundreds of companies, so it's been great to keep bouncing ideas off him. Thanks also to the engineering leaders, DevEx leaders, CTOs, and engineers we interviewed, who helped us check whether these intuitions were right. So, who is the book for?

Lenny RachitskyLet me detour for a second to talk about Abi and DX, since you just mentioned him and it fits today's conversation so well. Abi founded DX, a great name for a developer experience company. They ended up selling to Atlassian for a billion dollars, at a very high ARR multiple. To me, that's exactly the proof of why this conversation is so valuable, and how much companies will invest in improving developer experience. Atlassian was willing to pay a billion dollars for it. It was a good early-stage company that people loved, but it was still early-stage, and the valuation hit a billion. The core logic is that they serve all these companies using Jira and other products, and those companies are all asking: how do we actually measure productivity? That's worth a lot of money to them. I know you were one of their early advisors, so...

Nicole ForsgrenYes.

Lenny Rachitsky... which just shows how important this is.

Nicole ForsgrenRight. I think it also shows how much value you can unlock from this. There's so much low-hanging fruit, so much untapped potential, and often you just don't know where to start. Even at big companies, and I've been at a few, full of experienced, smart people: if you haven't thought from this angle before, it's hard to know where to begin, or it's easy to make simple mistakes early and have to redo things later. So back to the question of who the book is for. It's for anyone who cares about DevEx, especially technical leaders, anyone who wants to kick off a DevEx initiative, or anyone already running a DevEx improvement program. I think it's especially relevant for PMs, because if you're PMing something that involves building and creating software, improving DevEx will only help your team. More importantly, PMs usually bring critical skills, insights, and instincts that matter a lot in DevEx and that many engineering teams miss.

Lenny RachitskyOkay. What's the framework? What are the steps? Where do people start?

Nicole ForsgrenThe book walks through a seven-step process and closes with some core principles. Step one is starting the journey. If you're just getting going, start here. It includes what we talked about earlier: go talk to people, do a listening tour, synthesize what you learn, visualize workflows and tools, and figure out the current state. Step two is getting a quick win. Start small, get a fast win, pick the right project, and share what you did. Step three is using data to refine the work. Build the data foundation, find existing data, start collecting new data, and use surveys to get fast insights; the book includes sample surveys. Step four is deciding strategy and priorities. Once you have data, you need to know which of all the things that could be broken matter most. If you've already had your quick win, then of everything left, "what do I do next?" We walk through some evaluation frameworks there.

Step five is selling your strategy. Now that you've decided, you have to convince everyone else. You need to get feedback and explain why this is the right strategy right now. Step six is driving change at your scale. Here we separately address people who only have local control. If you're starting from a single dev team, you do it yourself, grassroots-style; if you're a VP of developer experience or similar, you also have top-down levers; and if you're somewhere in the middle, you think about how to use both. Step seven is evaluating your progress, demonstrating value, and looping back.

I'll also say that we wrote the book so you can jump in wherever you are. If you're starting a team or an initiative, it really is best to start at step one; you really should start at step one. If you're joining an existing program, you can jump straight to prioritization, or to executing the change. So that's the seven-step process. Beyond that, we also recommend practices: how to resource it, how to do change management, how to keep the technology sustainable, and how to bring a PM lens: how to treat developer experience as a product, and how to treat the metrics we own as products too.

Lenny RachitskyGreat. Okay, I have a few questions. First, tell people where to find the book. What's the URL? How do they get it? When is it out?

Nicole ForsgrenSure: developerexperiencebook.com. You can join the mailing list now, we'll let you know when preorders open, and we'll share parts of the workbook. We've also built a companion workbook of nearly a hundred pages to use alongside the book, expected to be published before the end of the year.

Lenny RachitskyOkay, so an important piece of this is that the term "developer experience" is deliberate. It's not developer productivity, and it's not developer work. It's asking: how do we make the experience of developers at the company better, which of course includes them getting more done, but also them being happier, in a better state, all of that. That's a meaningful distinction, right?

Nicole ForsgrenYes, exactly.

Chapter 06 / 08

Section 06

Chinese translation complete

Lenny RachitskyOkay.

Nicole ForsgrenBecause ultimately this isn't only about productivity. We were just talking about it from the angle of doing the right things. Of course you want high efficiency, but you also have to think through... one of the most amazing things about engineers is that you hand them a problem without telling them how to solve it, and they often do it better. They have freedom, innovation, and creativity to really crack the problem. If you only look at productivity, you're left with lines of code, PR counts, or whatever metric. What we really want to talk about is value: how to unlock it and how to get it faster. That includes making them more efficient and reducing friction, because that's when they get flow and their cognitive load drops, all the things we talked about earlier.

Lenny RachitskyAwesome. Okay, if someone wants to build this team from scratch, what does that usually look like? I remember at Airbnb the team formed gradually; it started with one or two engineers getting it going. How would you suggest setting up a pilot team, and then scaling it?

Nicole ForsgrenThere are a few ways, right? If you're starting from within, you can staff one or two engineers plus a PM, PgM, or TPM to help with communication, because the communication plan really matters here. While you're small, you look for changes that show value quickly, things you can do in a small scope. Some people call these paper cuts, the small things that keep nicking people. The goal is for people to see and feel the value themselves through these small changes: "How can a developer's work get better? How can the daily work get smoother?" And then momentum builds. If you're driving it top-down and have the authority, you still look for quick wins, but they can be more global, because you have the infrastructure or support to make changes that aren't just local.

For example, a small, very local change is cleaning up the tests, the test suite. Any team can do that; any team. At a larger scope, it might be changing an org-level process that's become too heavyweight, or investing a bit more in cleaning up provisioning environments.

Lenny RachitskyOkay. Once a team like this is in place, what impact have you seen on engineering teams?

Nicole ForsgrenI'd say at small companies the impact is huge; at big companies, it's routinely hundreds of thousands or millions of dollars, sometimes hundreds of millions, billions. And we have to learn how to tell that story: how does the math actually work? Often we can look at time saved and cost saved, and lots of other things. We can treat speed to value as speed to market, and we can look at risk reduction. But those gains are real. I'll add that it usually follows something like a J-curve. First there are a few quick wins and it looks like a big harvest; then you dip, because the most visible low-hanging fruit has been picked. Then we have to do more homework. Maybe we need to add some infrastructure, maybe some telemetry so we can capture the data we want. Once that's in place, the gains really start to compound.

Lenny RachitskyBack to that measurement question: how would you suggest people find these numbers? I think the most powerful part of this is being able to say, "We saved a million dollars through this." What would you look at to compute that?

Nicole ForsgrenA few things to keep in mind. First, who are our key audiences, and there's usually more than one. We obviously need to be able to talk to developers, because they're the users of these systems; they'll also build the systems with you, or at least give you feedback. So for developers, we usually frame it in terms they care about. Time saved, for example. If something gets faster, they save time. Not having to do that setup anymore means less time on those things, less toil. Compliance and security matter a lot too. Often there are several manual steps in there... I'm not saying they have no value, but from an individual developer's perspective, they're not value-adding. Automate them where you can; and then there's better focus time.

That's the developer perspective. Leadership usually cares about those too, but often cares more about other things. So we can talk about dollar costs, whether we can accelerate revenue, what our time to value looks like, how fast we are, and how quickly we get feedback from customers. For teams and orgs in highly competitive environments, that lands especially well, because it's fundamentally about speed. We can also talk about saving money, and there we can try to quantify the savings. Take tests and builds again: if we clean up the test and build suites, what developers most want to hear is how much time is saved and that the system is more reliable. They get a lot less toil, because they're not rerunning tests over and over or constantly tidying up the test suite.

From the business side, cleaning up the test and build suites can also translate directly into cloud cost savings, because those tests run in the cloud. If they're constantly failing, or just burning spend for nothing, you can reclaim some of that. We can also always talk about time and productivity gains, like "how much developer-time-equivalent capacity is going to things that don't add value?" Sometimes we can also correlate these with business outcomes; correlation is usually the best we can do here, but there can be a compelling relationship between, say, reduced time to value and market share growth.

Lenny RachitskyLet me follow that thread back to the question everyone cares most about with AI and productivity. I don't think anyone really has a standard answer yet, but I'd love to hear yours. What should people actually do today? What's the best way to understand what AI tools are doing to productivity? People are spending all this money, and I don't know... what are we getting? I assume things are getting faster, but it's hard to say. So if someone asks, "Okay, what should I try first?", what's your best advice for measuring the productivity impact of AI tools?

Nicole ForsgrenI'd say it depends. Partly on what your leadership actually cares about. We're usually pretty good at figuring out what developers care about and communicating that to them. But if right now we just want to grab the two or three most critical data points and focus, because the data work can be hard when you're starting, then listen to how they talk. In what you hear, are they talking about market share? Market competitiveness? If so, focus on speed. Find metrics for feature-to-ship, feature-to-customer, and feature-to-experiment speed, and figure out what that feedback loop they describe actually looks like. If they keep talking about margins, go find savings and translate them into reclaimed labor cost. Sometimes when you rework a process you don't need as many vendors, so reducing vendor spend helps too. The reason I say "it depends" is that sometimes leadership keeps saying certain things and they become themes. If you can solve the problem they're currently wrestling with, or frame your work around what they actually care about, even just shifting the wording slightly, do it. If they call everything developer productivity, then just call it productivity; if they say velocity, and velocity is what they care about, frame it as velocity; if they say transformation or disruption, find a way to connect it to disruption. Because that's what they'll resonate with. We don't want them to have to work to understand what we're doing and what value we're creating.

Lenny RachitskyThat's great advice. To play it back: if a company wants to figure out what AI tools are actually doing for them, the first step isn't technical; it's figuring out what the company and its leaders care most about. Maybe market share, maybe margins, maybe velocity, maybe "we need to be faster" or "we need to transform." So your advice is to pull that out of the words and phrases you hear, and then find ways to measure those things: is market share growing, are margins improving. I love your examples, like time from feature idea to ship, or to experiment; maybe start tracking there. For margins, it's fewer failed test runs, or vendors you no longer pay, things like that. For velocity, I assume you'd think of DORA-type things, how fast engineering delivers, releases, ships... how would you define that part?

Nicole ForsgrenI'd say to draw the boundary as wide as you can. So if you can go idea to customer, or idea to experiment, and look at how long it takes, how long it usually takes, and how long it takes now that AI tools are being used better and there's less friction, that's valuable. On attribution, I think the book touches on this too. Whose win is it, DevEx or AI? Just say it honestly. For example, "Yes, we rolled out AI tools, and we also made DevEx improvements, and those two teams worked very closely together." Both almost certainly contributed, right? If you'd only had the AI tools without the DevEx improvements, you'd have seen some gains, but nowhere near this much.

Lenny RachitskyIf people start today, say "I want to start measuring developer experience," are there two or three foundational metrics they should start measuring as soon as possible?

Nicole ForsgrenIf you're starting today with nothing, go talk to people first. Talk to people; that's step one. After that I'd run a survey, because a survey quickly gives you the overall picture of where the biggest pains are. I say that because if you're just starting, the systems may have no instrumentation, no real metrics. Even if they do, they may not be what you think you want. Metrics designed without a purpose are suspect; metrics designed for some other purpose may or may not work, so don't assume you already have what you need. That's part of why I like surveys, and the book includes an example. You can ask a few questions: "How satisfied are you? What's the biggest barrier to your productivity? Or, what makes it hardest to get work done?" Then let people pick from tools, and from processes, and have them pick three. Just three.

Chapter 07 / 08

Section 07

Chinese translation complete

Nicole ForsgrenOf those three, how often do they affect you? Hourly, daily, weekly, quarterly? Because some problems hit you every day and are maddening. Some only surface once a quarter, but at the end of the quarter they're brutal. Then add one open-ended question, like "Anything else we should know?" That gives you a very strong signal. Because once you make people rank their top 3 instead of selecting everything, the data doesn't get so noisy. Just those 3, plus frequency, and you can even compute a score yourself, or a weighted score if you like. Then dig in: where should this data live? What other data do we actually need? But at least you have some kind of baseline. It's a subjective baseline, but now you know what the biggest challenges are.
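The top-3-plus-frequency tally described above can be turned into a simple weighted score. This is a minimal sketch; the frequency weights, barrier names, and response format here are illustrative assumptions, not anything prescribed in the book.

```python
from collections import defaultdict

# Hypothetical weights: how painful a barrier is, scaled by how often it bites.
FREQ_WEIGHT = {"hourly": 8, "daily": 4, "weekly": 2, "quarterly": 1}

# Each response: up to three barriers, each tagged with how often it hits.
responses = [
    {"flaky tests": "daily", "slow builds": "hourly", "provisioning": "quarterly"},
    {"slow builds": "daily", "code review delays": "weekly", "flaky tests": "daily"},
    {"slow builds": "hourly", "provisioning": "quarterly", "flaky tests": "weekly"},
]

def barrier_scores(responses):
    """Aggregate a weighted friction score per barrier across all responses."""
    scores = defaultdict(int)
    for response in responses:
        assert len(response) <= 3, "the survey asks for at most three barriers"
        for barrier, freq in response.items():
            scores[barrier] += FREQ_WEIGHT[freq]
    # Highest score first: these are the barriers worth digging into.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

for barrier, score in barrier_scores(responses):
    print(f"{barrier}: {score}")
```

The output is still the subjective baseline Nicole describes, but it makes "what do we fix first?" a ranked list instead of a pile of anecdotes.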
我很喜欢这一点,因为所有东西最后都会回到先跟人聊、问这些问题。这和产品管理、做出好产品其实很像:你有没有真的去问你的客户?大家都觉得自己在做,但大多数人其实做得远远不够。

Nicole Forsgren还有一点挺难的,是当你开始拿数据的时候。访谈当然也是数据,而且很重要;survey 则更量化一点,因为我们可以把它变成计数。但这时候也要小心。很多人写问卷题目的时候会写成这样:“过去一周里,构建和测试系统是慢,还是复杂?”这一下其实问了四个问题。你如果回答 yes,那到底是构建的问题,还是测试的问题?是慢,还是 flaky,还是复杂?所以要把真正的信号拆出来,其实挺难的。花点时间和懂问卷设计的人聊一聊,或者直接跟 Claude、Gemini、ChatGPT 说:“这是我的问卷题目,你能不能帮我优化一下?”然后最好再过几轮。这个问题设计得好吗?我拿到的数据能回答什么问题?我能解决什么问题?如果一个问题不能被数据回答,那就别把这个数据收进来。

Lenny Rachitsky而且你书里也有示例问卷,大家完全可以直接复制粘贴,不用自己从头想。

Nicole Forsgren对,有示例问卷,也有很多示例问题。我们甚至还建议了格式、流程应该是什么样,问卷应该多长,应该多长就够了,不能多长。

Lenny Rachitsky我读到的一点是,你其实不太喜欢 happiness survey,也就是直接问工程师有多开心的那种问卷,是这样吗?如果是,为什么?

Nicole Forsgren对,不喜欢。因为影响 happiness 的东西太多了。开心这件事本来就很复杂,对吧?开心可以来自工作,也可以来自家庭、爱好、周末,还有太多太多别的东西。并不是说我不在乎 happiness,而是我觉得 happiness survey 在这里并不好用。更有帮助的是 satisfaction,可一说到这个,大家就会说:“那不就是一回事吗?”不是,因为你可以问“你对这个工具满意吗?”然后再追问几个后续问题。它们当然有关联,因为你对工作、工具、事情本身、团队越满意,整体幸福感通常也会越高。我以前还会开玩笑……你知道那种广告吗,类似“快乐的牛才会产出快乐的奶酪”?

Lenny Rachitsky不知道。

Nicole Forsgren我记得是 Calabrian 的。那是最经典的一句。快乐的开发者会写出更好的代码。代码写得更好,活也做得更好,还是更好的团队成员和协作者。但如果你要直接去捕捉、或者试图直接影响 happiness,那不是我们该做的事。太难了,也太包罗万象了。Satisfaction 还能给我们一些信号。

Lenny Rachitsky换个完全不同的方向,单说你看到大家在用的工具,有没有哪个会让你觉得,“哦,这个真的特别常见、特别好用”?就是大家用得很多、成功率也很高的那种。比如 Copilot、Cursor 这些常见工具。有没有什么特别想分享的,像“嘿,这个工具你们可以去看看,大家都挺喜欢”?

Nicole Forsgren我觉得它们都很大啊,对吧。Copilot、Cursor、Gemini。

Lenny RachitskyClaude Code。

Nicole Forsgren对,Claude Code。我很喜欢 Claude Code。

Lenny Rachitsky我这边马上就会发一篇长文,讲怎么把 Claude Code 用在非工程场景里。

Nicole Forsgren酷,挺好。

Lenny Rachitsky真的很有意思。比如 Claude Code,“帮我找找笔记本电脑里怎么清理存储空间”,它就会直接告诉你一堆文件。它就像在你电脑上跑的 ChatGPT,你可以让它在电脑上给你做各种疯狂的事,简直像个小上帝。

Nicole Forsgren我现在就要去试试。这个太好了。

Lenny Rachitsky真的很好用。对,所以我才会写这篇。我之前播客请过 Dan Shipper,他说 Claude Code 是市面上最被低估的 AI 工具,因为很多人根本不知道它有多能干。它不只是用来写代码的,这也是我现在越来越想挖的方向。好,还有没有别的东西,你觉得能帮大家提升 developer experience,帮大家适应这个 AI + 工程的新世界,而我们还没聊到的?

Nicole Forsgren我觉得有个很重要的总原则,就是给任何 DevEx 改进都带上产品思维,顺手连我们收集和记录的指标也一起带上。我的意思是,我们要先识别问题,确保自己解决的是一群用户的真实问题。我们要想到 MVP、想到实验、要快速拿反馈、快速迭代。我们要有策略,要知道我们的可触达市场是谁,要知道成功是什么样。基本上要有一套 go-to-market 的思路。我们需要沟通,需要持续从用户那里拿反馈,我们要不断改进。再往后,我们也要想清楚什么时候该把某个东西停掉。它是在维护模式里,还是该下线了?

我觉得这点一般就很重要,但现在尤其重要,因为我们用着 AI 工具,也在把 AI 嵌进产品里,变化太快了。这时候停下来半拍,问一句:“我到底在解决什么问题?”就特别关键。比如,我们过去 10 年一直在用的那个指标,现在还重要吗?还是说它应该被下线,因为它已经没那么重要了?它并没有驱动我现在真正需要的那些决策和行动。

Lenny Rachitsky在我们进入那个很让人兴奋的 lightning round 之前,我想先带你进一个叫 AI Corner 的环节,这是这个播客里的固定栏目。你有没有发现生活里、工作里,某个 AI 工具的用法特别有意思,想分享给大家,也许别人会觉得有用?

Nicole Forsgren我最近一直在做家装和房间重新布置之类的事。我是在跟设计师合作,因为我知道自己喜欢什么,但我不知道怎么走到那里,我在这方面不太行。但我真的很爱用 ChatGPT 和 Gemini 来帮我渲染图片。比如我可以给它户型图,给它一张房间现状的照片,哪怕那张照片和理想效果完全不像;然后我再给它几张不同风格的参考图,告诉它把墙换一下、把家具布局改一下、或者把别的东西改一改。它能帮我,而且速度挺快。它帮我更好地把想法可视化。还是那句话,我知道我喜欢什么,但我不知道怎么到那里,所以我知道它对不对,这个用法可能挺随机的,但现在玩起来很有意思。

Lenny RachitskyMy wife is doing this too. She keeps sending me, "Here's what this rug would look like in our living room." "Here's what this water feature would look like." It works really well, and it keeps getting better. You'd be amazed: "Wow, that's our house, just with a new rug." You just upload two photos and ask, "What would this look like in our room?"

Nicole ForsgrenYes, I've been wowed several times already. The machines are definitely listening to us. When it generates a rendering of my room, it throws in a dog bed, because I have a dog. I'm like, "I didn't ask for that, but yes, this room absolutely should have a dog bed in that color and style."

Lenny RachitskySpeaking of which, have you tried this one? Ask ChatGPT directly: "Based on everything you know about me, generate an image of what you think my home looks like."

Nicole ForsgrenNot yet.

Lenny RachitskyBecause it has memory and remembers everything you've talked about, the results are hilarious. You have to try it.

Nicole ForsgrenOkay, that's going on my to-do list.

Lenny RachitskyThere you go, a bonus use case. Nicole, with that, we've reached our very exciting lightning round. I've got five questions. Ready?

Nicole ForsgrenGreat, let's go.

Lenny RachitskyWhat are the two or three books you recommend most to other people?

Nicole ForsgrenOutlive by Peter Attia is excellent. Another one I think is related: I hurt my back a while ago, which was no fun, but Back Mechanic by Stuart McGill is fantastic. Shout-out to anyone with a back injury. It's written for regular people; after reading it, you can figure out how to manage your own lower back pain. It's a bit random, but I really do love How Big Things Get Done. I can't pronounce the author's name. I think one of them has a Scandinavian background. It breaks down some of the biggest projects in recent history and looks at how and why they failed. I think that's really interesting for us, especially in this AI moment, when basically every software system is going to change. So how do we approach something that is, in essence, going to become a mega-project? And then, sorry, I want to add one more: The Undoing Project by Michael Lewis. Matt Velloso recommended it to me, and it's a great read.

Lenny RachitskyYes, I've read that one...

Nicole ForsgrenWhen I got to the very last line, I literally went, "Whoa."

Lenny RachitskyRight? I was like, "What?"

Nicole ForsgrenI was [inaudible 01:03:48]. Yes, I did not see that coming at all.

Lenny RachitskyI didn't even remember that last line from when I read it. My goodness. Okay, next question. What's a movie or TV show you watched recently and really liked?

Nicole ForsgrenI'll admit I watch Love Is Blind. If you need to zone out at the end of the day, it's fun.

Lenny RachitskyThe new season is out.

Nicole ForsgrenYes, looking forward to it... And Shrinking. Have you seen Shrinking?

Lenny RachitskyNo. I think I watched The Therapist, well, I tried to.

Nicole ForsgrenHighly recommend it. It's lovely.

Lenny RachitskyOkay, next one. What's a product you've discovered recently that you really love? It can be an app, a kitchen gadget, clothing, anything.

Nicole ForsgrenYes, the Ninja Creami is really good...

Lenny RachitskyDidn't you mention that last time too?

Nicole ForsgrenI don't know. Maybe, but I don't think so.

Lenny RachitskySomeone told me about this, and I still remember it. It's the kind of thing where...

Nicole ForsgrenIt really works.

Lenny Rachitsky...you make ice cream with it, right?

Nicole ForsgrenYes, you can basically freeze a protein shake and it turns it into ice cream...

Lenny RachitskyOh my god.

Nicole Forsgren...and it's actually pretty good. The other one is a Jura coffee machine. I love good coffee, but I'm not great at making it, so I just press a button and it gives me whatever I want, lattes, cappuccinos, all of it. So that's been fun too.

Lenny RachitskyNice, okay. Do you have a favorite...

Nicole ForsgrenJust sugar and caffeine. That's all I need to get through the day.

Lenny RachitskyThat's engineering productivity 101.

Nicole ForsgrenYep.

Lenny RachitskyOh my goodness. Okay, two more questions. Do you have a favorite life motto, something you find useful at work or in life and keep coming back to?

Nicole ForsgrenYes, there's one I think I've mentioned a few times. It's not a word-for-word quote; it's more of a sentiment. It's that "hindsight is 20/20": everything looks obvious in retrospect, but that's also kind of a silly thing to say. I think if, in the moment, we made the best decision we could with the information we had, then that's all we could have done. If you knew better and still made a bad decision, that's obviously not great. I think we're not gracious enough, with ourselves or with others, because we always learn more later.

Lenny RachitskyWell said. Last question. I was going to ask something else, but while we were prepping this episode, you mentioned you have a new role at Google. Tell us what you're doing there, why Google, and what people should know.

Nicole ForsgrenSure. I'm now the senior director of developer intelligence and core developer. The role is exciting and interesting because it maps directly onto everything we've been talking about. It focuses on Google and its products, and the underlying infrastructure: how we improve developer experience, developer productivity, velocity, all the things we've discussed. And because I'm a fairly data-oriented person, we also think about how to measure these things, how measurement is changing, how feedback loops are changing, how to make the whole experience better, and then how to drive that change through the organization in a way that's meaningful, impactful, and faster than before.

Lenny RachitskyGoogle got a steal hiring Nicole. I need to go buy more Google stock. Okay, two follow-ups. For people who want to go deeper: where can they find you online, where can they find your book, and how can listeners be useful to you?

Nicole ForsgrenOnline, the book is at developerexperiencebook.com, I'm at nicolefv.com, and I occasionally show up on LinkedIn. It can be chaotic there, so I try to fish the signal out of the noise. In any case, I go there to do something useful, and I'd love for people to check out the book and the companion workbook. The workbook is free. I'd also love to hear feedback: what's useful, what isn't. I always love hearing those stories.

Lenny RachitskyNicole, thank you so much for being here.

Nicole ForsgrenThanks for having me, Lenny.

Lenny RachitskyMy pleasure. Thanks again. Bye, everyone.

Thank you so much for listening. If you found this valuable, you can subscribe to the show on Apple Podcasts, Spotify, or your favorite podcast app. Also, please consider giving us a rating or leaving a review, as that really helps other listeners find the podcast. You can find all past episodes or learn more about the show at lennyspodcast.com. See you in the next episode.
