Ethan Smith
Transcript
Lenny Rachitsky[00:00:00)]There's this term everyone's hearing about,
AEO. Ethan Smith[00:00:02)]Answer Engine Optimization is how do I show up in LLMs as an answer?
It feels like such a big deal to win at AEO. Ethan Smith[00:00:09)]In order to win something like "What's the best website builder?" at Google,
they would win if their blue link showed up first.[00:00:15)]But that's not the case in the LLM, because the LLM is summarizing many citations.
ChatGPT is driving more traffic to my newsletter than Twitter. Ethan Smith[00:00:25)]You can get mentioned by a citation tomorrow and start showing up immediately. You can have a Reddit thread,
you can have a YouTube video.[00:00:31)]You can be mentioned on a blog. So early-stage companies can win,
they can win quickly. Lenny Rachitsky[00:00:36)]Are the leads that these answer engines are driving to companies actually valuable?
Ethan Smith[00:00:41)]Significantly more valuable. Webflow saw a 6X conversion rate difference.
Google's slice of the pie stays the same. The pie gets bigger. Lenny Rachitsky[00:01:05)]Today my guest is Ethan Smith. Ethan is the CEO of Graphite and my go-to expert for all things SEO. SEO is going through a major transition right now. Everyone used to go to Google anytime they had a question, or were looking for a product or doing research. These days, a lot of people are moving to ChatGPT and Claude, and Gemini and Perplexity to get answers to their questions,
and this will only be accelerating over time.[00:01:29)]And even Google is changing the search experience in a pretty radical way with AI Overviews at the top, and their newly introduced AI Mode, which is basically their own version of ChatGPT. This means that the world of SEO is going through a big change, including the rise of AEO, which stands for Answer Engine Optimization. Basically, SEO for ChatGPT,
getting your product to show up in the answers that people get.[00:01:51)]Ethan has been at the forefront of this new skill and channel. And in this conversation,
he shares everything that he's learned about how to get your product to show up more often inside of the answers that people get. The advice that Ethan shares in this conversation is incredibly tactical and worth a lot of money. So please slurp it up and use it for your own products.[00:02:10)]If you enjoy this podcast, don't forget to subscribe and follow it in your favorite podcasting app or YouTube, it helps tremendously. And if you become an annual subscriber of my newsletter, you get a year free of 15 incredible products, including Lovable, Replit, Bolt, n8n, Linear, Superhuman, Descript, Wispr Flow, Gamma, Perplexity, Warp, Granola, Magic Patterns, Raycast,
ChatPRD and Mobbin.[00:02:29)]Check it out at lennysnewsletter.com and click product pass. With that, I bring you Ethan Smith. This episode is brought to you by Orkes,
the company behind Open Source Conductor, the orchestration platform powering modern enterprise apps and agentic workflows. Legacy automation tools can't keep pace.[00:02:47)]Siloed low-code platforms, outdated process management, and disconnected API tooling fall short in today's event-driven, AI-powered agentic landscape. Orkes changes this. With Orkes Conductor, you gain an agentic orchestration layer that seamlessly connects humans, AI agents, APIs, microservices,
and data pipelines in real time at enterprise scale.[00:03:07)]Visual and code-first development, built-in compliance, observability and rock-solid reliability ensure workflows evolve dynamically with your needs. It's not just about automating tasks, it's orchestrating autonomous agents in complex workflows to deliver smarter outcomes faster. Whether modernizing legacy systems or scaling next-gen, AI-driven apps,
Orkes accelerates your journey from idea to production.[00:03:31)]Learn more and start building at Orkes.io/Lenny. That's O-R-K-E-S.I-O/Lenny. My podcast guest and I love talking about craft and taste, and agency and product market fit. You know what we don't love talking about? SOC 2. That's where Vanta comes in. Vanta helps companies of all sizes get compliant fast and stay that way, with industry-leading AI,
automation and continuous monitoring.[00:03:56)]Whether you're a startup tackling your first SOC 2 or ISO 27001, or an enterprise managing vendor risk, Vanta's trust management platform makes it quicker, easier, and more scalable. Vanta also helps you complete security questionnaires up to five times faster so that you can win bigger deals sooner. The result? (00:04:14): According to a recent IDC study, Vanta customers slash over $500,000 a year in costs and are three times more productive. Establishing trust isn't optional, Vanta makes it automatic. Get $1,000 off at Vanta.com/Lenny. Ethan,
Ethan Smith: Excited to be back. Lenny Rachitsky[00:04:42)]We did a podcast episode just over two and a half years ago. I think of it as the definitive guide on how to win at SEO. People have been referencing it ever since. I'm really proud of what we did there,
but things have changed.[00:04:54)]Things are changing in the world of SEO. And so I'm excited to talk to you again about how to be successful in this newly emerging world where AI is changing how SEO works,
the rise of AEO and GEO.[00:05:08)]Let me start with just this question. How long have you been working on SEO at this point? And has anything come close to being this significant in changing the skill of SEO?
Ethan Smith[00:05:19)]Yes. So I got started in SEO in 2007, so it's been 18 years. Actually, the largest change when I got started in SEO, I got started in programmatic SEO and commerce SEO, like NexTag and Shopping.com and PriceGrabber. And that was when you could do mass,
auto-generated landing pages.[00:05:41)]And that was probably the biggest shift, which is Google introduced a bunch of algorithms, Panda and similar things, to prevent you from doing spam. So essentially, you went from the SEO being spam to not spam. That was probably the biggest change,
and then this is probably the second-biggest change.[00:05:56)]I think that the main thing here is it is related to search,
but it's a summarization of search and there's new inputs. So it's probably the second-biggest change. Lenny Rachitsky[00:06:03)]Okay, that is really interesting,
because I think a lot of people are seeing this as everything is different. Nothing we've done before is going to work.[00:06:10)]We have to rethink everything. You're saying this is actually the second-biggest change, and that Google's update back in the day was actually even more significant?
Ethan Smith: Yep. Lenny Rachitsky[00:06:18)]Very cool. Okay,
let's set a little context for folks. Let's define some terms. There's this term everyone's hearing about.[00:06:25)]There's actually two, AEO and GEO. What do they stand for? Are they different? What are they referring to specifically?
Ethan Smith[00:06:33)]They, I think, are the same. Ultimately, the definition of a word is whatever a group of people agree is the definition of a word. So I think we'll see what people decide is the definition of the word. I'll put forward my definition. So AEO and GEO are essentially trying to describe the same thing, which is how do I show up in LLMs as an answer? (00:06:52): And I personally prefer Answer Engine Optimization versus Generative Engine Optimization, because generative, you can generate images and videos and things other than an answer. Whereas answer is more narrowly defined,
so my personal preference is we're talking about optimizing LLMs.[00:07:09)]So an answer is more narrow of a definition than generative, but ultimately,
it's whatever we decide is the name and the definition is what it will be. Lenny Rachitsky[00:07:19)]Okay. Yeah,
yeah. Answer Engine Optimization sounds a lot cleaner to me if you had to pick one. So it's good to know they're the same thing. Some people just prefer the latter one for some reason.[00:07:28)]It's interesting because recently, I don't know if I told you this, but I was looking at my referral traffic. And I found that ChatGPT is driving more traffic to my newsletter than Twitter,
So somehow it's already happening. I'm excited to learn just how to lean into that potentially and optimize it further. Ethan Smith[00:07:47)]And when did you see the spike? Did you see when it started growing dramatically?
Lenny Rachitsky[00:07:50)]Unfortunately, the dashboard I have doesn't give me a great view of referral traffic over time. When do you think I probably saw it?
Ethan Smith[00:07:57)]Companies that we work with started in January and it started, one, because of more adoption,
but two is because the answers became a bit more clickable.[00:08:05)]You have maps, you have shopping carousels, you have clickable cards. So I think the clickability of the answer is increased,
and then the adoption increased and that was around January. Lenny Rachitsky[00:08:14)]Okay. I want to come back to this question of, "Is this good that ChatGPT is sucking all my content and giving people answers, and then sending me some percentage of that?"
But let's not get into that yet. I want to talk about just what kind of impact you can have on having your stuff show up in ChatGPT.[00:08:32)]So I had the head of ChatGPT, Nick Turley, on the podcast recently. I asked him, "What do you think of all this stuff, AEO, GEO?" He's like, "Don't worry about any of that. Just write awesome stuff, great quality content. It'll figure it out. It'll find the best stuff."
I imagine you very much disagree.[00:08:47)]I imagine you have seen real impact getting your stuff proactively into these answer engines. Talk about just the kind of impact you've seen and just your reaction to that?
Ethan Smith[00:08:56)]Yeah. I agree and disagree, but the way that I think about it is anything can be optimized. You just need to understand the underlying systems and the rules of the game, and if you do that, then you can optimize anything. You can optimize algorithms, you can optimize people,
anything could be optimized.[00:09:11)]What I think he probably meant by that, he probably meant two things. One is, "Please don't spam my product." And two is, "If you do, I will see it and I will stop you from doing that." So it's not a long-term, robust strategy to create spam, just like it wasn't a long-term,
robust strategy to create spam on Google.[00:09:29)]Eventually, Google was going to say, "Huge shopping comparison sites are making 100 million auto-generated search pages and I don't like it, and I'm going to get rid of the whole category." So same thing with ChatGPT, anything can be optimized, but if you're spamming it,
they'll have a whole team looking at that, and then they'll change their algorithm to prevent you from doing that. Lenny Rachitsky[00:09:48)]What kind of impact have you seen? You've done work with a lot of companies,
we'll talk through a few examples.[00:09:53)]Maybe share one to give us context just like how much can you impact this sort of thing where you show up in, say, ChatGPT more often?
Ethan Smith[00:10:01)]You can affect it a lot. So a specific example with Webflow is we are working with Webflow on their SEO. We're working on their content,
and we're seeing a lot of wins on the Answer Engine Optimization side.[00:10:16)]So the specific things that we've done there, one is just traditional SEO. So make landing pages for high-search volume keywords,
like best no-code website designer.[00:10:28)]And then for free, you'll get Answer Engine Optimization impact from that. So that's just traditional SEO,
which works very well for AEO. Lenny Rachitsky[00:10:35)]I was just going to say,
that sounds exactly the same as regular SEO. Ethan Smith[00:10:38)]Yeah. So I would say everything that works in SEO works in AEO, but there are additional things beyond SEO that also work in AEO. So second thing, and the way that I think about AEO versus SEO is that the head and the tail are different. So the head is different in that in order to win something like what's the best website builder? (00:10:59): Even if Webflow's URL shows up number one on the citations, they're not going to win the answer because their URL showed up number one, but at Google they would win. If their blue link showed up first they would win,
but that's not the case in the LLM. Because the LLM is summarizing many citations and so you need to get mentioned as many times as possible.[00:11:18)]So usually when you ask something like, "What's the best tool for X?" The first answer will be mentioned the most in the citations, because that's very different from Google. And so for Webflow, we work with them on YouTube videos, Vimeo videos, getting mentioned in Reddit, getting mentioned in other blogs, affiliates,
stuff like that.[00:11:39)]So tried a bunch of stuff. Stuff that worked especially well was just straight SEO, number one. Number two is YouTube videos,
and then the third is Reddit optimization. Lenny Rachitsky[00:11:47)]Okay, wow. So you're saying if you can get to number one, when you ask ChatGPT, "What's the best website builder?" (00:11:54): And Webflow's at the top, that doesn't actually drive them as much traffic as simply being mentioned most often across the summary?
Ethan Smith[00:12:01)]Yes. And part of why that's interesting is because when startups come to me and ask me for SEO help, my first response is, "Don't do it at all. Spend your time on something else because you're not going to be able to grow SEO early on in search." Because you don't have enough domain authority and it takes a while to get domain authority,
and only once you have domain authority can you rank.[00:12:23)]And so for Google, it's usually something that you do Series A, Series B or later. You don't do it as soon as you start because you can't win early on. That's not the case for Answer Engine Optimization, because you can get mentioned by a citation tomorrow and start showing up immediately. You can have a Reddit thread,
you can have a YouTube video.[00:12:42)]You can be mentioned on a blog, like a brand-new YC company launches, everyone's talking about them. They could show up in an answer tomorrow as a result of that. So early-stage companies can win, they can win quickly. And they can win quickly and anyone can win quickly,
by getting mentioned as many times as possible by the citations. So that's what's different about the head.[00:13:04)]What's different about the tail is that the tail is larger in chat than in search. So the average number of words, I think Perplexity said this to somebody, is around 25 words, whereas for Google it's around six words. So the tail is just much,
much larger. People are asking lots of follow-up questions. Lenny Rachitsky[00:13:19)]The tail, the prompt essentially, the question you're asking?
Ethan Smith[00:13:23)]Yes. Meaning that if you map out all of the questions that people ask, kind of like an SEO, long-tail keywords, if you do long-tail questions, the size of the tail is larger. Meaning the amount of questions that are very specific is larger,
the share and the volume.[00:13:42)]And there's probably questions that have never been asked before and questions that have never been searched before, because search can't support lots of really specific,
super-specific stuff. Whereas chat is specifically made to ask a bunch of follow-up questions and have a conversation.[00:13:58)]And so there's all these questions that have never been asked or searched for before that are now being asked, and then you can win that. And when I got started in SEO, it was long-tail SEO where you have a page for every single keyword, which doesn't work anymore,
but now the long tail is back in chat.[00:14:13)]And if you know all those really specific questions that people are asking, you can also win that, and you can probably also win that early. And I've seen examples of early-stage companies who just launched some really specific AI-enabled payment processing API thing,
and they will show up. And they'll show up because they're answering questions that's never been answered before. Lenny Rachitsky[00:14:35)]Are the leads that these answer engines are driving to companies actually valuable? Are these good-quality leads for B2B SaaS especially?
Ethan Smith[00:14:44)]They are significantly more valuable. So Webflow, we saw a 6
X conversion rate difference between LLM traffic and Google Search traffic. Lenny Rachitsky[00:14:55)]Six times?
Ethan Smith[00:14:56)]Six times, so significantly more qualified. I think that's probably for a couple of reasons. Probably it's because you're so primed because you're having a conversation with multiple follow-ups,
and so there's so much intent that you've built.[00:15:08)]And you've probably really narrowed in on what you want, so when you're going somewhere,
it's probably highly qualified. And so we're seeing that it's just a much higher conversion rate. Lenny Rachitsky[00:15:18)]Wow, this is so interesting, and it makes sense. People trust ChatGPT to tell them the answer, and if you are the answer, you have so much advantage. Like that is what people want to know, and then, "Okay. Cool, thank you. I'm going to go check this out." (00:15:35): This all just makes sense. Going back to the three levers you shared, essentially it's the things that you see work in driving you showing up more in these answer engines, landing pages, YouTube videos and Reddit. Is that right?
Okay. Ethan Smith[00:15:51)]The other things, so I would break it up into stuff on your site, onsite and offsite. So onsite would be traditional SEO. The difference would be this long tail. I would also say that the difference is lots of follow-up questions about does your product do this thing? What are the use cases, features, integrations, languages? (00:16:07): Tell me about your product and really specific details about that and that's on your site. And then the second group would be offsite, which is show up in all the citations. Citations are comprised of video, UGC like Reddit and Quora,
affiliates.[00:16:23)]Dotdash Meredith is showing up all over the place, Glamour, Good Housekeeping, it's like getting mentioned there, blogs,
so it's those two groups. Lenny Rachitsky[00:16:33)]And that all sounds very similar to SEO showing up on other people's pages. Showing links from, say,
Reddit is always great.[00:16:41)]It's interesting that Reddit is such a big deal. What's going on there do you think?
Ethan Smith[00:16:45)]Okay, Reddit is one of the most interesting things. It's hugely cited in LLMs. And it's probably the number one thing customers are asking me: "How do we optimize for Reddit?" And this goes back to the head of ChatGPT's point about, "Please don't spam my product." (00:17:06): And so Reddit is a community where it's real opinions from people, authentic, and it's heavily managed by the community and the community is very good at managing it. And so the obvious strategy for a growth person is, "Let's make a bunch of automated spam and spam Reddit all over the place and get my product to show up everywhere." (00:17:27): That's the growth mindset, which makes sense, the hustle mindset. So what are people looking at? They're looking at creating hundreds of fake Reddit accounts pretending to be someone that you're not. I'm a single person, I'm going to make 100
Reddit accounts. I'm going to autopost comments and then like my own comments.[00:17:46)]And then build a trust score, and then shout, say everywhere that my product is the best product. Fortunately, that doesn't work very well, but that's the obvious strategy. And so we're seeing people trying to do that and then we're also seeing those accounts get banned, those comments get deleted. And so we're seeing people trying to spam and being unsuccessful,
so that's one strategy.[00:18:05)]The other strategy is the whole purpose of Reddit is to post useful, high-quality, authentic comments from real people. So at Webflow, we have a couple of people at Webflow going to comments and saying, "This is my name, this is where I work, and here's a useful piece of information."
So the strategy is find a thread that is a part of a citation that you want to show up in.[00:18:30)]Say who you are, say where you work, and then give a useful piece of information, and that works really well. And that sounds simple if you're not in the growth mindset of, "I need to scale this to hundreds of comments." But you don't actually need 10,000 comments,
even five could be great and that scales perfectly well.[00:18:47)]So the Reddit strategy is the obvious strategy, which is just to be an actual user of Reddit. Make an account, say who you are, say where you work,
and give a useful answer. Lenny Rachitsky[00:18:56)]We had the early-growth leader from Deel, D-E-E-L, on the podcast a while ago. And this is how they grew initially, before AI even came around,
just going big on Reddit and answering people's questions.[00:19:07)]And like, "Hey, happens to be Deel. Can I help you with this problem?" So that's interesting. It's so interesting that Reddit is what is keeping ChatGPT from being spammed with stuff. It's not that ChatGPT is stopping the spam,
it's Reddit that's just really good at that. Ethan Smith[00:19:23)]I think that in a sense, ChatGPT is policing because ChatGPT is running a search,
it's finding citations. There's a search algorithm that's trying to select which citations are useful. There are people at ChatGPT who are tuning their search algorithm to select which sources they trust.[00:19:40)]I'm sure that there's a search evaluation team saying, "Do I like these citations, yes, no? Is Reddit showing up? I want it to show up." So I think that there are actual people at ChatGPT who are intentionally configuring their algorithm to use Reddit because it's trusted. And if it wasn't trusted,
they wouldn't use it.[00:19:57)]Same with Google. Google has specifically configured their search algorithm to rank Reddit and Twitter and Quora, because they want user-generated content. And if it wasn't good content,
then they would change the algorithm and they wouldn't rank it. So I think that they are policing it in a sense. Lenny Rachitsky[00:20:13)]Got it. And all of this is post-training, search-oriented features of these models. It's not data they are trained on, is that right?
Ethan Smith[00:20:24)]I would assume so. There's the core model and then there's RAG. So the core model is I'm looking at Common Crawl, at billions of web pages, and then I'm training the model. And if you ask something like, "What's the capital of California?" it predicts the next word, which is Sacramento. And that's based on the core algorithm,
which is next-word prediction.[00:20:44)]Then there's RAG and RAG basically means search, retrieval-augmented generation. So I'm going to do a search and then I'm going to summarize the search. There are these two different things. And so most of what I'm describing is about the RAG piece,
not the core model piece. To influence the core model is probably extremely hard and maybe you'll see the impact a year later.[00:21:03)]And it's probably something, some sort of obscure thing that nobody would want to do, like make a million pages that say, "Best product for X is brand." Which I don't think most people want to spend their time on. So I'm mostly focused on the RAG side,
And I think also the LLM is probably not going to say your product if it didn't show up anywhere on the RAG. So I think that's where most of the interesting stuff is from an optimization perspective. Lenny Rachitsky[00:21:27)]Cool. Yeah. I didn't even think about this side of it when we started talking about this, but I think that's an important thing to note,
is just this has nothing to do with the training data.[00:21:34)]This is post-training, once the model's live, what it can do to find recent information using RAG, web search, things like that. Okay. Before we get into how to actually do this step-by-step,
how to win at AEO.[00:21:48)]What are two or three things that you think are important for people to understand to be successful in this world just broadly?
Ethan Smith[00:21:54)]First thing is just recognizing that this is related to search. So it's LLM plus RAG, it's summarizing a set of search results usually. So LLM plus RAG, number one. Number two is topics. So in search, a landing page is targeting hundreds of keywords,
which we talked about on the last podcast.[00:22:12)]So I'm not targeting one keyword like I was in 2007, I'm targeting 1,000 keywords, and each landing page needs to target that set of 1,000 keywords, and that's a topic. Same thing is true for Answer Engine Optimization. Each page is targeting hundreds, thousands,
maybe tens of thousands of questions.[00:22:28)]And so I want to group all those questions, which then brings us into content, so how would I rank? How would I get my URL to rank? Or how are other URLs being decided whether or not they rank? Then answer all the questions. The more of the questions that I answer,
the better.[00:22:42)]So in Google Search, if I have a landing page about website builders, the more that my page answers all of the subtopic, follow-up questions, the more likely I am to show up in Google Search. Same with chat, the more you answer all the questions, the better. If you don't answer a question,
then you're probably not going to show up.[00:22:57)]And if you answer a follow-up question and subtopic somebody else is not answering, you're going to be more likely to show up. So topics, number two. The third is question research, so how do I know which questions people are asking? And that's actually pretty hard, because in search,
Google just tells you, through their ads API.[00:23:15)]They say, "This is the search volume for this keyword." There's a truth set from Google, and ChatGPT is not giving us that, at least not yet. Maybe when they do ads, they'll give us more access to search volume, but there's no truth set. So how do we know the questions that people are asking? (00:23:31): One way would just be to take all my search terms and change them into questions. So for "website builder," you can assume that "What's the best website builder?" is a question that's probably asked proportional to the search volume for that keyword,
so that's one.[00:23:44)]But then I mentioned that the tail is larger, and there's parts of the tail that don't exist in search. So how do we know what the tail looks like? And one strategy that you can use is: what are all the questions people are asking you on your sales calls, in customer support, on Reddit? (00:24:02): Mine all those questions that exist somewhere else. Probably those same questions are being asked in chat, so that's another way to find questions. The last is citation optimization or offsite. So again, the LLM is summarizing RAG. So how do we show up with as many citations as possible? (00:24:21): And you can break up the citations into different groups: my site, video, YouTube, Vimeo, UGC, Quora, Reddit. Tier-one affiliates like Dotdash, tier-two affiliates,
blogs. So it's breaking up all those different citations and having specific strategies for each group. Lenny Rachitsky[00:24:40)]What is Dotdash exactly?
Ethan Smith[00:24:42)]Dotdash Meredith is a large media conglomerate with Good Housekeeping, Allrecipes,
Investopedia. It's probably the most successful SEO company of all time.[00:24:53)]And it's also one of the most cited,
probably the most cited in LLMs as well. Lenny Rachitsky[00:24:59)]Wow, did not know this. As you talk, I think about if you go to Google, no offense, Mr. SEO, but if you go to Google these days, it's just like a bunch of unuseful stuff,
just like this hyper SEO'ed content.[00:25:12)]Do you think ChatGPT will be able to avoid that fate where it's just a bunch of hyper SEO'ed content that is not what you actually want?
Ethan Smith[00:25:19)]Probably. And what you're saying with SEO is that everyone's rewriting each other's content, nonexperts rewriting each other's content. So I get a content-scoring tool, which then looks at all the results in Google and it says, "These are all the things that the other articles are saying. And then this is what you haven't said yet, so here are recommendations for how to be more typical." (00:25:40): And then everyone rewrites each other's article. And then one other interesting thing is that the majority of landing pages drive no impact. So we did an analysis where one out of 20 landing pages drive roughly 85% of all your traffic. So 19 out of 20 landing pages drive little to no traffic, which means if I want to get ROI,
I need to spend a small amount of money on a large number of pages.[00:26:06)]And so then you get a nonexpert to say, "Rewrite this other person's article," because that's cheaper than hiring someone from The New York Times to write your article about what's the best payroll management software? But if you knew the few things that would work, the few landing pages that would work, and you wrote them really well, then you could push all that money to that one page,
which is what we try to do.[00:26:28)]But right now it's people rewriting each other's content, so Google has not solved that yet. That's probably a very hard problem to solve. Will they ever solve that? Probably. Will ChatGPT ever solve that? Probably. How I would solve that would be, one concept would be information gain. So did you say something that somebody else didn't say? Two is how typical are you? (00:26:50): Are you so typical that I think that you're a rewritten version of somebody else's content? Potentially, Google has E-E-A-T, experience, expertise, authoritativeness, trustworthiness, which actually I don't see having an effect unfortunately, but it could. And I could say, "Well, this person's an expert, this person's a certified financial advisor, rank them higher." (00:27:08): And I'm actually not seeing that, but they could increase the weight of that. So these are all potential solutions, but I'm sure the reason why it has not been solved yet, and why everyone's rewriting each other's articles, is that it's probably just hard to build an algorithm to solve that. But will they ever solve that?
Probably. Lenny Rachitsky[00:27:23)]This algorithm or heuristic you just shared is so interesting, because it's helpful for just what is good content, say, with a newsletter or a podcast? Info gain, and is it typical? (00:27:34): Are you adding something new to the conversation and is this unique?
I think it's a really good strategy for just producing great newsletters and podcasts and all the content in the world. Ethan Smith[00:27:45)]Yes. And ideally, did you do original research and do you have some domain expertise? And did you mention that in the content?
Lenny Rachitsky[00:27:51)]This is a great heuristic for just content in general, which is exactly what you want these algorithms to be looking for,
so the alignment is there.[00:27:59)]This episode is brought to you by Great Question, the all-in-one UX research platform loved by teams at Brex, Canva, Intuit and more. One of the most common things I hear from PMs and founders that I talk to is, "I know I should be speaking to customers more, but I just don't have the time or the tools."
That's exactly the gap Great Question fills.[00:28:18)]Great Question makes it easy for anyone on your team, not just researchers, to recruit participants, run interviews, send surveys, test prototypes, and then share it all with powerful video clips. It's everything you need to put your customers at the center of your product decisions. With a prompt as simple as, "Why did users choose us over competitors?" (00:28:35): Great Question not only reveals what your customers have already shared, but it also makes it incredibly easy to ask them in the moment for fresh insights from the right segment. Picture this, your roadmap's clear, your team's aligned, you're shipping with confidence,
and you're building exactly what your customers need. Head to greatquestion.com/Lenny to get started.[00:28:56)]Let's give people an actual, actionable plan to start executing on this and winning essentially at AEO. If it's helpful to use my newsletter as an example, how would I show up more often on ChatGPT or Gemini or whatever? Or if it's a B2B SaaS company, whatever's easiest,
let's just talk about how to actually do this. Ethan Smith[00:29:12)]First, I would figure out which questions I want to rank for. How I would figure out which questions I want to rank for, I would take my search data. I would maybe take my paid search data, like, "What are my money terms? What are my competitors' money terms?" So if I'm rippling, what is deal.com bidding all their paid search on? (00:29:30): Then I would transform those into questions. And actually you can just give those keywords to ChatGPT and say, "Make these into questions," and it does a pretty good job. So take your competitors' paid search data or mine or your own, put it in ChatGPT, get the questions. That's step one. Step two is then track them, so put them in an AEO tracker,
in an answer tracker.[00:29:51)]Third thing would be who is showing up as citations? And then have a strategy for each of those different groups of citations. The next would be make your own landing pages. So what are the kinds of landing pages that are appearing? Is it a listicle? Is it a category page? Is it an article, tool page? Figure out what page type seems to be showing up the most,
and then you make your own page for that.[00:30:14)]How do you have your page rank? Answer all the follow-up questions. So what are all the follow-up questions that someone might ask? You could go back to your search data and look for groups and themes of your keywords that are in your SEO topic. Same thing for AEO topic. Then on the offsite,
so different strategies for each of those groups.[00:30:37)]And I would say that depending on the company, paying an affiliate to mention you, that's pretty easy if you have the money. So if you want to be the best credit card, you pay Forbes and then you're the best credit card. So that's strategy one, expensive, easy, controllable. The YouTube, Vimeo strategy is also actually pretty easy because there's no community saying, "I don't like your YouTube video." (00:30:59): You make a YouTube video, you do whatever you want. Maybe people view it, maybe they don't, but you can make a YouTube video or a Vimeo video. And the interesting thing with this, especially for B2B, is that YouTube, Vimeo, other video sites, the kinds of things people make videos for are food, traveling, fun,
beauty.[00:31:18)]There's not that many videos about AI-powered payment processing APIs, as interesting as that is, but it's a great money term. So if you make a video for these really specific, high-LTV, maybe nonglamorous keywords, questions, topics, that's actually a big opportunity. Then Reddit, so I mentioned with Webflow what we did, which is just make a Reddit account, say who you are,
say where you work and give a useful answer.[00:31:48)]That one is a little bit trickier because the community might say, "I don't like your answer." So you can't guarantee that your comment is there, but it is easy, so I would do that group. Oh, and then experiment design,
experiment design and seeing what works. So SEO and AEO are both interesting in that the majority of the information and best practices are not correct.[00:32:13)]And the reason why is because people don't do analysis. Somebody will say something and then it will get repeated, and then it becomes best practice and no one ever did an analysis. So you did all the stuff that I just mentioned. Do an experiment and see if it worked. Maybe half the stuff I said works,
maybe half it doesn't. Do your own experiment.[00:32:32)]Most best practices, most blog posts are not correct. So how do you set up an experiment? You get your questions, you turn tracking on, give it a couple of weeks. Make your changes, have a test group, have a control group. Intervene on the test group, make your changes, see if the chart went up, see if the control group did not,
and now you know your particular strategy worked.[00:32:55)]So I would definitely do experiments and I would not assume that stuff you read online is correct. And then you need a team, so who's your team? Probably your team is your SEO team, or your SEO agency or your SEO consultant. Probably, hopefully, they can do this stuff. However,
what I think is hard to hire for is the offsite stuff.[00:33:16)]So most SEO people are not going to be amazing at creating YouTube videos and Reddit strategy, so you might need a different person for that. That might be a community generalist marketing person. So it would basically be your SEO team, "Please now do Answer Engine Optimization." And then marketing community team, "Please help me show up in more citations."
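To make step one of this plan concrete (taking your own or a competitor's paid search terms and turning them into questions), here is a minimal sketch using the OpenAI Python client. The model name, prompt wording, and seed keywords are illustrative assumptions, not anything Ethan prescribes.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Seed keywords, e.g. pulled from your own or a competitor's paid search terms.
keywords = ["no-code website builder", "payroll management software"]

prompt = (
    "For each keyword below, write five natural-language questions someone might "
    "ask a chat assistant while researching that topic. Return one question per line.\n\n"
    + "\n".join(f"- {k}" for k in keywords)
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: any capable chat model works for this
    messages=[{"role": "user", "content": prompt}],
)

# One question per line; these become the prompts you feed into an answer tracker.
questions = [line.strip("- ").strip() for line in response.choices[0].message.content.splitlines() if line.strip()]
for question in questions:
    print(question)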
Lenny Rachitsky[00:33:35)]Wow, okay. That is incredibly valuable. Thank you for sharing all that. I imagine some of this is you're just giving away a lot of amazing advice for free here. Thank you. First of all, I imagine there's a layer,
there's only so far you can go on your own.[00:33:49)]And so eventually it's like, "Okay, we really need help." And that's where a team like yours comes in. Let me ask a few questions here to follow up. One is this tracker concept. So what is this tracker, it can track how often you show up? Say Lenny's Newsletter shows up and answers for the questions that I'm targeting?
Ethan Smith[00:34:03)]Yeah, so there's answer tracking, which is like keyword tracking. So keyword tracking would be "best growth podcast," and you put that in a keyword tracking tool. There's 100 of them, they're all the same, and you see whether and where you rank. Maybe you rank, hopefully you rank number one. Now in answers it's very different,
but it's related.[00:34:25)]So if you ask the same question, you will have different answers each time. If you ask a question, there's different answers per run. And so ChatGPT is basically calculating a distribution of all the potential answers it would give. And depending on when you ask it, it's basically like a weighted, random sample,
and so you're going to get different answers.[00:34:46)]You also have question variants, so you can ask different versions of the same question, and you might show up in one and you might not show up in another. Then there's different surfaces, there's Perplexity, there's Gemini, there's ChatGPT, there's Meta AI,
and so these surfaces have different answers.[00:35:00)]And so you essentially need to create a share of voice across all these different things like a distribution. So how often am I showing up? What's my average rank? And that's answer tracking. So then where do you get answer tracking? And answer tracking is essentially an evolution of keyword tracking. So we have a page with 60
different answer tracking tools.[00:35:22)]But it's ultimately just like keyword tracking, it's all the same thing roughly. And so pick one of the 60, we have answer tracking, we're building answer tracking. There's 59 other options, probably all pretty good, probably all pretty similar,
but pick one. My general suggestion is pick the cheapest one that does what you need.[00:35:41)]Just like keyword tracking, there's not a premium version of keyword tracking. You rank number three or you don't. So pick the keyword tracker that is the cheapest that does what you want. Same with the answer tracking. And so then when I'm doing the experiment, put your answers in, track them, see a chart over time,
see your average rank.[00:35:58)]How often are you showing up and what's your average rank? And then you make a change, and then hopefully you go up.
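As a rough illustration of the share-of-voice and average-rank math being described here, the small Python sketch below assumes you have already recorded, for each question, surface, and run, the position at which your brand appeared (or None if it did not). The data structure and numbers are illustrative, not output from any particular tracking tool.

from statistics import mean

# Each record: (question, surface, run_id, rank), where rank is your brand's
# position in that answer, or None if you were not mentioned at all.
runs = [
    ("best website builder", "chatgpt", 1, 2),
    ("best website builder", "chatgpt", 2, None),
    ("best website builder", "perplexity", 1, 1),
    ("best no-code website builder", "chatgpt", 1, 3),
]

ranks_when_mentioned = [rank for (_, _, _, rank) in runs if rank is not None]

share_of_voice = len(ranks_when_mentioned) / len(runs)  # fraction of runs you appeared in
average_rank = mean(ranks_when_mentioned) if ranks_when_mentioned else None  # position when you did appear

print(f"share of voice: {share_of_voice:.0%}, average rank: {average_rank}")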
Lenny Rachitsky[00:36:03)]Amazing. I love this term voice share. I never heard that before, it makes sense. Like percentage of time you're showing up in LLMs, is there an LLM, is it just like ChatGPT? (00:36:13): Is Google equivalent now to ChatGPT? How do you recommend people think about, say, Gemini or Claude, or Perplexity and others?
Ethan Smith[00:36:21)]So interestingly, there are similar foundational algorithms across all of these. They're all using search and they're all using LLMs, so the foundational algorithms are all the same. The results are actually pretty different. So we're doing a study,
we're seeing that Google and Bing are not that similar search engines.[00:36:41)]We're seeing that ChatGPT citations and Google Search results are actually not that similar. Perplexity is interestingly more similar to Google than ChatGPT. We did a study looking at thousands of questions and saw the citation overlap with Google Search results was around 35% for ChatGPT and Google,
so not that much.[00:37:02)]Perplexity was around 70%, but essentially they're all similar algorithms, but with very different citations and results. So then look at which surfaces have the most traffic and then track those. You probably don't need to track all of them,
but look across all those.[00:37:17)]But you do need to look at your share of voice or the percent of time you show up across all these surfaces. You need to ask the question multiple times,
and you need to ask the variants of the question to truly know how frequently you're showing up. Lenny Rachitsky[00:37:28)]Considering that ChatGPT, they're going to hit something like a billion weekly active users in the near future, do you need to worry about Claude and Gemini and Perplexity? (00:37:39): Is the traffic there meaningful? I know it is a lot of people, but how important is it to focus on those other LLMs?
Ethan Smith[00:37:45)]Well, the way that I would answer that is I believe AOL was one of the largest search engines early on and Google was not. And so we could ask in 1999 or whatever, "Should we just focus on AOL search and Yahoo search? Do we really need to worry about Google?"
And the answer is we don't actually know.[00:38:04)]It's very early, we don't know who's going to win. I do think that ChatGPT for sure is going to be large. Will Perplexity or Claude or these others compete with them? Probably. Just like search,
I think that there will probably be multiple winners and probably you'll need to optimize for several.[00:38:18)]I don't think that you'll need to optimize for 10,
but there'll probably be around three or so that will win that you want to optimize for. Lenny Rachitsky[00:38:26)]Okay. By the way, I want to make it clear, I love Claude. I use Claude and ChatGPT equally,
roughly. I didn't want to make it sound like ChatGPT is the only product people use.[00:38:34)]Okay. How does this strategy change depending on the kind of company you are? Say you're a B2B SaaS company or a consumer product, does anything in these seven steps change significantly?
Ethan Smith[00:38:44)]Let's take B2B,
for example. The first thing is that the citations that are being mentioned are going to be quite different. So citation optimization will vary quite a bit. Lenny Rachitsky[00:38:53)]Just to clarify what you just said, what do you mean when you say citation strategy is different?
Ethan Smith[00:38:57)]Meaning the citations that show up for B2B versus marketplaces are different kinds of citations. So for B2B, it might be like TechRadar shows up a ton when I ask questions. I've never read TechRadar, but for some reason it shows up all the time. I'm sure it's great. But TechRadar is showing up a ton for B2
B for whatever reason.[00:39:19)]In commerce, it's not going to be that, it's going to be Glamour and Cosmopolitan. For marketplaces, it'll be Eater and Yelp, TripAdvisor, places like that, so the kinds of citations that show up are different. Most of the stuff that I've been talking about is specific to B2
B. Here's the stuff that's different for commerce.[00:39:38)]So for most B2B questions, the answers are not clickable. There's nothing to click on. And so if you actually want to measure the impact, you cannot just look at last-touch referral traffic. You have to see whether or not you showed up in the answer with tracking. And then you also need to ask the user, "How did you hear about us?" post-conversion to actually know the impact. (00:40:00): So it's harder to track for B2B. Also for B2B, you're probably deciding which payroll management software to use after 50 touchpoints. With a brand, it's not going to be that you just search for something and suddenly spend $100,000 on payroll management software. So that's B2B. Commerce is different,
so commerce actually now has more clickable cards, like you would see in Google.[00:40:21)]So if you ask, "What's the best TV for apartments?" there are actual shoppable cards. Those shoppable cards are showing multiple sellers. Those sellers have rich snippets. Schema is important, the number of reviews is important, so it's actually quite different.
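For the commerce case, the schema and review counts mentioned here generally live as structured data on the product page, placed in a script tag with type application/ld+json. Below is a generic schema.org Product snippet; the product name, rating, and price are placeholder values, and which fields each answer engine actually reads is my assumption rather than something stated in this conversation.

{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example 55-inch TV",
  "brand": {"@type": "Brand", "name": "ExampleBrand"},
  "aggregateRating": {"@type": "AggregateRating", "ratingValue": "4.6", "reviewCount": "1873"},
  "offers": {"@type": "Offer", "price": "499.00", "priceCurrency": "USD", "availability": "https://schema.org/InStock"}
}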
You can look at last-touch referral traffic to get a good sense of the number of conversions that you're getting.[00:40:44)]That's for commerce, and it's similar with restaurants and hotels and local marketplaces. And then I would say early stage is also different. So I mentioned earlier, early stage my recommendation is don't do SEO at all. For Answer Engine Optimization, definitely do AEO, and only do citation optimization and long tail. Don't do any of the mid-SEO stuff,
just get cited and answer really specific questions. Lenny Rachitsky[00:41:12)]It's so interesting that so much of this is just showing up as the little tag/pill in the answer,
because it's obvious now that I think about it.[00:41:21)]That's the only way someone will get to your site from an LLM is just clicking that, "Okay, let me go read this article."
Ethan Smith[00:41:27)]Yes. But what they will do is they will open a new tab,
and they will type in the brand name and they'll go to Google.[00:41:33)]And then they'll click on your domain,
and you will think that it was a branded Google Search when it wasn't.[00:41:38)]Or they'll open up a new tab and they will type in your domain,
and they'll go directly to your domain and you'll falsely think that it was direct traffic. Lenny Rachitsky[00:41:46)]Coming back to a question you raised at the beginning. So for my newsletter, the fact that they're sucking up all this content, I don't even know how much,
and sending me some percent of traffic.[00:41:55)]Do you have any, I don't know, just sense of is this good? If you were running my newsletter, would you encourage all these outlets to suck up my stuff? And then be like, "Oh yeah, you could check it out in Lenny's Newsletter if you want"?
Ethan Smith[00:42:08)]Yes. And I would give the same answer that Brian Balfour gave on your previous episode on this, which is that it's not your choice whether to play the game. You are playing the game whether you want to or not, so you might as well try to show up. If you just say, "Don't look at any of my data,"
then you cannot show up and your competitors will.[00:42:27)]Now, what you can do is you can say, "I don't want you to train on my data, so you can index my site, but please don't train on my data." And they have different user agents for that and different bots. And we're building a Webflow app to block training but not indexation. (00:42:43): Or you can just put it in your robots.txt and say, "This training bot, not allowed. Index bot, you are allowed." So if you're concerned about that, I would suggest that, and I think probably a lot of people will do that. But saying, "You can't index my site at all," that doesn't make sense to me.
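As a concrete sketch of the robots.txt approach Ethan describes, the directives below allow an answer engine's search and indexing crawlers while disallowing its training crawlers. The bot names shown (OAI-SearchBot, GPTBot, Google-Extended) are the ones the providers document as of this writing and may change, so check each provider's documentation before relying on them.

# Allow OpenAI's search/citation crawler, but block its model-training crawler
User-agent: OAI-SearchBot
Allow: /

User-agent: GPTBot
Disallow: /

# Opt out of Gemini training without affecting Google Search indexing
User-agent: Google-Extended
Disallow: /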
Lenny Rachitsky[00:42:57)]Such a good point, because I don't know if I have competitors in this exact space.
Such a good point. Okay. Let me come back to the steps you shared just to see if there's something here that's worth diving into a little further. So this is essentially how to be more successful showing up in LLM responses. One is figure out what questions you want to rank for.[00:43:19)]And you could do this by looking at what your competitors are advertising and their paid ads and things like that. Just look at the terms, ask ChatGPT or Claude, "Turn these into questions people would ask to find these terms." Then set up a tracker to see just how you're doing today. How often are you showing up? (00:43:36): There's a million trackers, and you have a list we can link to so people can check these out. Then you look at who is showing up today? Where are they being taken today? Use that to inform landing pages that you create to answer those questions better. And you make it very clear that it's very important not to just answer that main question,
but also follow-up questions. Then there's offsite stuff.[00:43:57)]So getting into affiliates like Dotdash, YouTube, Reddit, and Quora sounds like the core, and then run an experiment. So you look at this tracker, and let me actually ask this, and the next step is just set up a team. But just to come back to this step, how do you set up an experiment that isn't just like a before and after? How do you do a control group situation?
Ethan Smith[00:44:19)]Yeah. So what I would do is I would take 100 different questions, half of them I will intervene on, half of them I won't. Or let's say, let's take 200 questions. So 100 of the questions, I'm not going to do anything, so that's my control group. And we are seeing a fair amount of variance in answers just without doing anything at all,
so you definitely want a control group.[00:44:40)]And also we're seeing people are using LLMs more and LLM traffic is going up. So you definitely need a control group, especially in Answer Engine Optimization. So control group is, "Don't touch it at all, leave it as it is." That's the control group. Test group would be, "I'm going to now comment on Reddit threads, so let's test that." (00:44:57): Or I'm now going to make a YouTube, Vimeo video, or I'm now going to pay Forbes advisor to say that I'm the best credit card. Maybe break those up into a few different buckets, track them. Have a couple of weeks before, a couple of weeks after,
compare against the control group.[00:45:11)]And then the stuff that went up when the control group did not is what worked, and the stuff that didn't did not, and then reproduce it. So reproducibility is very important. And my background's in academic research, and it's common to do a study that cannot be reproduced. And so for something to truly be accepted within academia,
it needs to be reproducible.[00:45:34)]Meaning multiple people have done this study and reproduced that thing over and over again. And especially in SEO, it's common for something to change. And you think that it was this thing that caused it and it's actually not,
and you just assume forever that that works. So reproducibility is very important.[00:45:49)]Try to do that study multiple times, try to get studies from other people, and if it works 10 times, then it probably works. And this comes back to the waste problem, most work is wasted in SEO. Most work is wasted in AEO, so how do you know what's not wasted? You do an experiment,
you don't assume that what you read online is true.[00:46:07)]You do your own experiment, and then you reproduce it multiple times, and keep doing the stuff that works and don't do the stuff that doesn't.
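As a rough sketch of the test-versus-control comparison Ethan describes, the Python snippet below assumes you have recorded share of voice for each question before and after the intervention window and compares the average change in the two groups. The numbers and the threshold are illustrative, not a prescribed methodology.

from statistics import mean

# Share of voice (fraction of runs you appeared in) before and after the
# intervention window, per question. Values here are illustrative.
control = {"q1": (0.20, 0.22), "q2": (0.10, 0.09), "q3": (0.30, 0.31)}  # untouched
test = {"q4": (0.15, 0.35), "q5": (0.05, 0.20), "q6": (0.25, 0.40)}  # intervened on

def avg_delta(group):
    return mean(after - before for before, after in group.values())

control_delta = avg_delta(control)
test_delta = avg_delta(test)

print(f"control moved {control_delta:+.2f}, test moved {test_delta:+.2f}")

# Only credit the intervention if the test group moved meaningfully more than
# the control group, and only trust it once the result reproduces.
if test_delta - control_delta > 0.05:
    print("Intervention looks effective; try to reproduce it before scaling.")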
Lenny Rachitsky[00:46:15)]It feels like such a big deal to win at AEO. Just coming back to this idea that people are coming to ChatGPT, Claude,
Gemini looking for an answer.[00:46:25)]If you're that answer, I feel like that could just make or break your company. It feels like even more important than SEO,
just getting this right. Ethan Smith[00:46:33)]I would say that, sure, I want to get the most conversions possible, but how big is the channel? The channel is not as big as search. Search is definitely larger, but it is a substantial channel now. And Webflow, they get 8%
of their signups from LLMs.[00:46:50)]It's now one of their top channels, so it's large. It's not the largest channel, it's not the number one channel. Paid is probably the number one channel,
but it's definitely a substantially large channel and one worth optimizing for. Lenny Rachitsky[00:47:02)]And as you said,
Ethan Smith: Yes. Lenny Rachitsky[00:47:05)]Okay. Let me zoom out a little bit,
and let me just ask you this.[00:47:08)]What do you think are maybe the most surprising or underdiscussed topics when it comes to AI and SEO and AEO that we haven't already talked about?
Ethan Smith[00:47:18)]The first thing is that there's significant misinformation on AI and on AEO,
and it's pretty extreme. Unusually, the ratio of misinformation to correct information is pretty substantial. So one example is every two years there are news articles about how Google Search is going to die or is dying because there's a new thing.[00:47:42)]So that's happening right now with AI Overviews and with AEO, Google's going down, which is not true. Before that it was TikTok search, so everyone is using TikTok now. Gen Z is using TikTok, they're never going to use SEO. SEO's going to be dead, and so you really need to focus on TikTok search. Which is not false, it's not untrue, but it's not taking share away from Google,
it's just a new surface.[00:48:05)]And then before that it was Instagram, and then before that it was Facebook and it was YouTube. And people do search and discover on Instagram, TikTok, YouTube, but it doesn't take away from Google Search. It adds on top of it. These are all new channels, so Google's slice of the pie stays the same,
the pie gets bigger.[00:48:23)]And so misinformation about Google going down, Google is not going down. Google published something recently, their VP of search explicitly said, "I looked at the traffic that we're sending to publishers, and it is not down, it's up slightly."
So it is not true that Google Search is going down.[00:48:37)]And most of the news information about that is saying that it's going down, so that's the first surprising thing. The second surprising thing is tooling. And I've never seen a channel where there are these extremely expensive tools that essentially do commodity tasks. So imagine if I said, "I'm going to charge you $50,000 for keyword tracking." (00:48:59): You would say, "Well, of course, that's absurd. It's keyword tracking, I could write this in a day." No one would do that. But for answer engines, it's mysterious and people don't really know how it's working. Also, the slope of the growth curve is so significant,
that I'm seeing people spend huge amounts of money on what are essentially keyword tracking commodities.[00:49:19)]So that's the second thing. The third thing is the growth curve of the channel. And we did a Reforge AEO webinar a year ago, and there was excitement and then it died and there was very little excitement about it. This was in June, and then people didn't really care. They were intrigued intellectually by it,
but they didn't care because they didn't see the impact from that.[00:49:40)]So there was essentially very little interest between July and January, and then suddenly in January it's just skyrocketing. So it's ChatGPT launches, people are very interested, and then it's not that interesting for growth people. And then there's this little spike in June, and then it's like this,
which is usually not what you see with a new channel.[00:50:01)]So the slope of the curve is unusually steep, and the shape of the curve is also very unusual. The last is that a lot of people do think that SEO and AEO are different and they're not different. I think probably part of that is because it sounds great to say that there's this new channel,
it's completely different.[00:50:23)]And I'm an expert and I have a tool to sell you, and it's totally unique and all these other tools are not relevant. In reality, it's actually there's quite a bit of overlap. There is the difference of the citation optimization. The head is different and the tail is different,
but the core technology is pretty similar. So those are probably the most surprising things. Lenny Rachitsky[00:50:43)]This piece about January being the inflection point, you mentioned that it was because references started showing up more prominently. Is that the big change?
Ethan Smith[00:50:51)]I think it's increased adoption of LLMs by people, so it's just actually growing more, and then the clickability. And we are seeing
now this large increase of actual clicks.[00:51:02)]Probably before you got no clicks, even if you showed up in an answer,
so the clickability of the answer has increased.[00:51:08)]Especially for things like commerce and local and hotels, because they have these rich modules where you can click on stuff and go somewhere,
which was not true before. That and I think people are just using LLMs more. Lenny Rachitsky[00:51:20)]Ethan, let me just say, I'm learning so much from this conversation, what a fun thing. I could see, it's just clear how much you love this stuff, and just how nerdy and deep you get into it. And it's just fun to talk to someone that's so deep and knowledgeable about all these things,
so thank you for sharing all this with us.[00:51:36)]I'm going to go in a slightly different direction. There's this whole world of AI content, people generating content with AI, generating landing pages. Just like, "Oh my God, SEO is never going to be the same, you can just generate all this stuff. AI is going to make all of this easier." (00:51:48): You guys did a really big study on how that works, whether it's a good idea to generate content with AI. Can you just talk about what you learned from that, and how people should think about using AI to generate content?
Ethan Smith[00:51:59)]Yes. So I remember when ChatGPT launched, and Brian Balfour posted on LinkedIn, "What do you all think is going to happen from ChatGPT and AI?" And my immediate response was spam: just lots and lots of spam, especially SEO spam. And then there was a whole industry around AI-generated content,
and I knew immediately that it wouldn't work.[00:52:20)]And the reason why I knew it wouldn't work, and when I say AI-generated content, I mean automated content with no human-in-the-loop. So I think that the future of content is clearly AI-assisted. Clearly, you and I will be using AI to help us write, so it's not no AI at all, but it's not 100%
generated with AI. I immediately knew that it wouldn't work.[00:52:38)]Why did I know that? I knew that because I created spam in 2007, and I knew what Google did about it and how. I knew the exact same thing was going to happen. What I did in 2007, along with all the other shopping comparison people, was scrape each other's content, reviews and snippets, chop it up, and generate 100 million search pages,
and it worked really well.[00:53:02)]And then it stopped working, and then all those companies disappeared. I knew that was exactly what was going to happen with AI-generated content. And so from the beginning, I've not focused on AI-generated content. Many people have, but I don't know,
so maybe it does work; there are lots of case studies about it working.[00:53:20)]So let's do the study, let's do an analysis. We looked at both Google and ChatGPT: we took thousands of searches and thousands of questions, put those searches into Google Search, and put those questions into ChatGPT,
and then we looked at the citations or the Google Search results. Then we ran them through an AI detector.[00:53:43)]We used Surfer SEO's AI detector. Now, when I tell people this, they say, "Well, you can't detect AI." So we evaluated the efficacy and accuracy of the AI detector. We did that by generating thousands of AI-generated articles, and it was very predictive on those. And then we looked at real articles,
we did that two different ways.[00:54:05)]One way is we wrote real articles ourselves, and the other is we took a random sample of 100,000 URLs from Common Crawl over the last five years. Then we looked at what the AI detector said about content published before ChatGPT launched, which necessarily was content not created by AI. The false positive rate was around 8%,
so basically the AI detector is very accurate.[00:54:29)]So we took that and ran it on the content. What we saw was that around 10% to 12% of the content cited in Google Search and in ChatGPT is AI-generated, and roughly 90% is not. And we ran a correlation analysis showing the exact same thing. So we essentially did a very rigorous study showing that AI content does not work. AI-assisted, human-edited content is great.[00:54:52)]We do that sometimes, other people do that, and that is clearly the future of content. So that does work and should work and that's good, but purely 100% AI-generated content does not work.
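To make the methodology concrete, here is a minimal sketch of the two measurements described above, assuming you already have a third-party AI detector to call and lists of document texts to score. The function names (detect_ai_probability, false_positive_rate, ai_share) and the 0.5 threshold are illustrative placeholders, not Graphite's actual pipeline.

```python
# Sketch of the two measurements: (1) validate the detector's false-positive
# rate on known pre-ChatGPT text, (2) estimate the share of AI-written content
# among cited pages. detect_ai_probability() stands in for whatever detector
# you actually use (e.g., an API call).

def detect_ai_probability(text: str) -> float:
    """Placeholder for a third-party AI detector."""
    raise NotImplementedError

def false_positive_rate(human_texts: list[str], threshold: float = 0.5) -> float:
    """Share of known-human documents the detector wrongly flags as AI."""
    flagged = sum(1 for t in human_texts if detect_ai_probability(t) >= threshold)
    return flagged / len(human_texts)

def ai_share(cited_texts: list[str], threshold: float = 0.5) -> float:
    """Share of cited documents the detector flags as AI-generated."""
    flagged = sum(1 for t in cited_texts if detect_ai_probability(t) >= threshold)
    return flagged / len(cited_texts)

# Usage (illustrative):
#   fpr = false_positive_rate(pre_chatgpt_common_crawl_sample)   # the study saw ~8%
#   share = ai_share(google_and_chatgpt_citation_texts)          # the study saw ~10-12%
```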
So then the second thing we found, and this was unexpected, is that there's more AI-generated content on the internet than human-generated content.[00:55:13)]Back to the Common Crawl study: we looked at 100,000 different URLs over the past five years, and you can see this curve where AI-generated is now higher than human-created. So there's more AI-generated content on the internet than human-generated content,
which is disturbing. So then let's say that AI-generated content did work.[00:55:32)]If AI-generated content worked, then everyone would do it. Just like the shopping comparison sites in 2007: if I can scrape content, why would I pay anyone to write it? I'll just scrape it from you and chop it up. So then everyone would do that,
and then it goes from most content being AI-generated to almost all of the content being AI-generated.[00:55:51)]What will happen if that works is that Google becomes a search engine for ChatGPT responses. And if Google is a search engine for ChatGPT responses, there's no reason for Google to exist, just go to ChatGPT. Which is the exact same thing that happened in 2007. (00:56:04): Google said, "I see all these shopping comparison search engines showing up in my search results. So I'm essentially a search engine for search engines." I should be showing the actual TV in my results, I shouldn't be showing other vertical search engines,
so I'm going to get rid of them and I'm just going to go straight to the product.[00:56:23)]The same thing will be true for ChatGPT. Now, let's say that ChatGPT ranks its own derivatives in its citations, so then you have this infinite loop of derivatives. I go to ChatGPT, I say, "Generate 10 articles." I put those articles into the citations and then I say, "Summarize these citations that were derivative." (00:56:41): And then I keep on doing derivatives of derivatives, so you have an infinite loop of derivatives, and now AI is summarizing itself. There's a paper about this called Model Collapse. So again, there's the core algorithm and then there's the RAG piece. For the core algorithm, a group did a study showing model collapse, which asked: what if you feed AI derivatives into the model and train the core model on those derivatives? (00:57:06): And what happened was you had all these problems, hallucinations, things break very quickly. Okay. So then we did a study on what if you feed derivatives into the RAG piece? Generate 10 derivatives, put that into RAG, summarize that. Then generate 10 more, and summarize my summarizations, an infinite loop of derivatives. What happens? (00:57:25): What happens is there's a wisdom of the crowd. The LLM is summarizing the opinion of many people. If you ask a question like, "What's the best flavor of ice cream?" there's not one answer, there's thousands of opinions. So the LLM is summarizing these many,
many opinions in this wisdom of the crowd.[00:57:40)]And the wisdom of the crowd basically says that if you take the average of a large group of people, their average response will be better than the best single individual in the group. So it's better to have more diversity of opinions; that's the wisdom of the crowd. So what happens with the infinite loop of derivatives?
You essentially converge on one opinion.[00:58:00)]So if you ask, "What's the best flavor of ice cream?" it will eventually say, "It's vanilla and it's only vanilla, and there's no other flavor of ice cream." That's a simple example, but if you feed derivatives of derivatives into the model,
you'll basically take the wisdom of the crowd.[00:58:15)]And that wisdom will shrink until you have a single opinion on everything, which is really bad. So that's what happens if 100% unassisted AI content works.
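As a toy illustration of that convergence, here is a small simulation, under the simplifying assumption that an "opinion" is just a number and that each round of summarization replaces the pool with averages of random samples of the previous pool. It is not the Model Collapse paper's setup, only a sketch of why repeated summarization of summaries shrinks diversity.

```python
import random
import statistics

# Toy model of "derivatives of derivatives": start with a diverse pool of
# opinions (here just numbers), and at each round replace the pool with
# summaries (averages of random subsets) of the previous round's pool.
# Diversity, measured as standard deviation, collapses toward a single opinion.

def summarize_round(opinions: list[float], pool_size: int, sample_size: int) -> list[float]:
    """Each new 'article' is the average of a random sample of existing ones."""
    return [statistics.mean(random.sample(opinions, sample_size)) for _ in range(pool_size)]

random.seed(0)
opinions = [random.uniform(0, 100) for _ in range(500)]  # diverse human opinions

for generation in range(6):
    spread = statistics.stdev(opinions)
    print(f"generation {generation}: spread of opinions = {spread:.2f}")
    opinions = summarize_round(opinions, pool_size=500, sample_size=10)
```

Run it and the printed spread drops sharply each generation; within a few rounds the pool has effectively converged on one "flavor of ice cream."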
Lenny Rachitsky[00:58:25)]I'm afraid of this world where everything is trained on AI, and AI is trained on AI and generating AI,
and nothing is trusted. And it's interesting just how much these incentives are driving this.[00:58:36)]If ChatGPT found this content valuable, this is what people would do, and then it just goes off the rails. So there's some team there that is keeping this from happening. How do you think this evolves? (00:58:46): If you were them, what would you do over the next few years to keep things high quality and not drive these perverse incentives?
Ethan Smith[00:58:55)]So I would identify what the perverse incentives might be, and AI-generated content is one of them. The second thing is I think that LLMs and search are going to converge. You're seeing that with Google Search, where they're adding LLM features like AI Overviews. You're seeing that with LLMs, where they're incorporating maps and shopping carousels,
and it's converging on search.[00:59:14)]I think it'll converge on a single experience, so that's the first thing. The second thing is to figure out what 2007 Ethan would do to create spam, like AI-generated content that isn't great content, and make sure he can't do it. And the third thing is there's all these other interesting features,
use cases that LLMs can be great for.[00:59:33)]So LLMs could be great for remembering everything that you've ever asked. It could be good for personalizing stuff specifically to Lenny. One interesting use case that I think will eventually come would be, I say, "Plan a trip to San Francisco,"
and decisions are made for you without any intervention. I have this wonderful EA named Jen.[00:59:51)]And I say, "Jen, I'm going to Miami. Please, just do everything for me," and she does everything for me. She knows me, she knows my preferences, she knows that I want an ocean view and a restaurant with music. She does all of that and I don't have to intervene. AI can essentially do that eventually,
and it would do that because it would deeply understand you.[01:00:09)]It would remember everything about you. It would have context, it would have reasoning, and then it would be able to make all these decisions without your intervention,
which would be autonomous agents. So I think that's another very interesting place for someone like me to optimize for. Lenny Rachitsky[01:00:25)]Yeah. I was just going to say, imagine not even being told this is what you're choosing. Like, "Oh, and go check out and subscribe to the best newsletter out there." And if you're out there,
good things will happen.[01:00:36)]Wow, what a wild world. Is there anything else we haven't covered that you think would be helpful to folks who are trying to get better at this stuff, trying to take the first steps down this road of AEO?
Ethan Smith[01:00:49)]Yes, the most exciting topic,
Sweet. Ethan Smith[01:00:54)]So I mentioned that people in chat are asking follow-up questions. They're looking for tools: do you have this feature, this use case, this integration? And that frequently can be answered in help centers. Usually, you would not go to an SEO team and say, "We really want you guys to focus on the help center." (01:01:14): But in chat, since there are all these questions about can you do this thing, can you fulfill my use case, a help center is actually a great place to answer that, so how can you optimize the help center? Number one is that it's frequently on a subdomain. For whatever reason, subdomains don't work as well as subdirectories, so move it to a subdirectory,
number one.[01:01:33)]Number two is make sure that you're cross-linking well. Usually you do not have optimized internal links, so link from help center page to help center page and make sure there's lots of cross-linking.
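As a rough sketch of how you might audit that cross-linking, here is a small script, assuming your help center already lives under a /help/ subdirectory. The HELP_URLS list, the /help/ prefix, and the threshold of five links are placeholders for your own site, not a prescribed setup.

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

# Rough audit of help-center cross-linking: for each help article, count how
# many of its links point at other help articles on the same host. Pages with
# few such links are candidates for more cross-linking.

HELP_URLS = [
    "https://example.com/help/getting-started",
    "https://example.com/help/integrations",
]
HELP_PREFIX = "/help/"

def count_help_links(url: str) -> int:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    count = 0
    for a in soup.find_all("a", href=True):
        target = urljoin(url, a["href"])
        parsed = urlparse(target)
        # Count links that stay on the same host and point into the help subdirectory.
        if parsed.netloc == urlparse(url).netloc and parsed.path.startswith(HELP_PREFIX):
            count += 1
    return count

for url in HELP_URLS:
    links = count_help_links(url)
    if links < 5:  # arbitrary threshold; tune for your help center's size
        print(f"{url}: only {links} links to other help articles")
```

The same idea extends to checking that any old subdomain URLs redirect into the subdirectory rather than serving duplicate pages.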
The third is the tail: you probably have help center content about the head, but you probably don't have help center content for the tail.[01:01:51)]An example of this: I wanted to track our sales calls and see who was in the meeting and what the sentiment was. And I wanted to put that into Looker, so I asked, "Which meeting transcription tool integrates with Looker?" And the answer is none of them,
but you could use Otter because Otter has a Zapier integration.[01:02:15)]You could send a Zap of the meeting, put it into BigQuery, and then do Looker on top of that. But there wasn't a help center article about that because it's a very obscure use case, but it's not a zero use case. And so the tail,
there's going to be a bunch of questions in the tail that you may not have help center articles for.[01:02:31)]So again, what are the questions in sales calls? What are the questions that you're seeing in customer support? Have pages for those. I might even open it up to the community, where anyone can ask anything, because the community will then fill in the tail and answer those questions.[01:02:46)]And again, in many cases there might be nobody talking about this at all. So you could be the only citation for it, and then win that tail of questions.
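One rough way to surface those gaps is to compare the questions you're hearing against the help center articles you already have. The sketch below uses naive keyword overlap as a placeholder; in practice you would more likely use a search index or embeddings, and the example questions, titles, and 0.3 threshold are made up for illustration.

```python
import re

# Naive gap check: which questions from sales calls / support tickets have no
# help-center article that plausibly covers them? Flag those as candidates for
# new tail articles.

QUESTIONS = [
    "which meeting transcription tool integrates with looker",
    "can I export transcripts to bigquery",
]
ARTICLE_TITLES = [
    "Getting started with transcription",
    "Exporting meetings with the Zapier integration",
]

def tokens(text: str) -> set[str]:
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def best_overlap(question: str, titles: list[str]) -> float:
    """Best fraction of the question's words covered by any single article title."""
    q = tokens(question)
    return max(len(q & tokens(t)) / max(len(q), 1) for t in titles)

for question in QUESTIONS:
    if best_overlap(question, ARTICLE_TITLES) < 0.3:  # arbitrary coverage threshold
        print(f"no help article likely covers: {question!r}")
```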
Lenny Rachitsky[01:02:55)]Are there any help desk software systems that are making this easier yet? Or do you think that's an opportunity for, say, Zendesk or Intercom?
Ethan Smith[01:03:02)]I think probably all of them should work perfectly well. I think the only things you need to do are cross-linking and a subdirectory rather than a subdomain,
which probably most of them do. So I think that they should all basically work out of the box.[01:03:12)]The main thing you would want to do would be, again,
open it up to the community and make sure that you fill in the tail. But probably all those tools should be good for this. Lenny Rachitsky[01:03:19)]Well, with that, we've reached our very exciting lightning round. I've got five questions for you, Ethan. Are you ready?
I'm ready. Lenny Rachitsky[01:03:25)]What are two or three books that you find yourself recommending most to other people?
Ethan Smith[01:03:30)]Number one is Emotional Intelligence. People talk about the concept of emotional intelligence, but there's actual research and psychology around it. I believe it was published in the '80s, but there's a really good book that summarizes the foundational research around emotional intelligence. And it's very useful, when building relationships and communicating with people, to understand their emotions. So that's the first one. And it matters for growth, because growth is getting people to use your stuff, and if you have frameworks for how people will use your things, then you can be a more effective growth person. Which brings me to my second book, which is Cialdini's Persuasion book. Robert Cialdini has written a bunch of books around persuasion, and again, there are frameworks for how to persuade somebody to sign up or buy something. He breaks down his framework for that, and again, it's based on psychology. And I think especially in growth,
there's all kinds of psychology research and behavioral economics research to inform tests.[01:04:25)]And if you just read Thinking, Fast and Slow, Persuasion, Emotional Intelligence,
They give this example of wanting to measure how good an orchestra conductor is: you could survey people, or you could count the number of standing ovations each conductor gets. More standing ovations probably means a better conductor, so you don't need to survey people.[01:04:58)]But much of growth and business involves things that are not immediately obvious how to measure, yet anything can be measured,
and so that's my third recommendation. Lenny Rachitsky[01:05:06)]Is there a favorite recent movie or TV show you've really enjoyed?
Ethan Smith[01:05:09)]I don't really watch TV, but I watch two different groups of things. I watch really aggressive sports, so I really like the Michael Jordan documentary, The Last Dance. I like the Lance Armstrong documentaries about how aggressive and confrontational he is,
and I love watching UFC. I like extreme aggression and intensity. The other group of stuff that I like to watch is climbing documentaries.[01:05:34)]So anything that Alex Honnold or Jimmy Chin do, I watch all of that, which is the exact opposite of aggressive sports. It's zen, being present, slow-and-steady craftsmanship. And this is how I approach my work, which is extreme intensity and aggressiveness, and then the zen craftsmanship,
being present. Lenny Rachitsky[01:05:57)]I love how this explains why people love working with you and why you're good at this: it's this competitiveness, and also the super nerdiness of getting really knowledgeable about how this stuff works.[01:06:09)]And then I didn't think about the zen element of it,
just like staying calm throughout it all. Ethan Smith[01:06:13)]Flow,
flow state. Lenny Rachitsky[01:06:15)]Flow,
Thank you. Lenny Rachitsky[01:06:19)]Okay, I'm going to keep going. Do you have a favorite product you've recently discovered that you really love?
Ethan Smith[01:06:24)]This camera and this microphone. So I got a Sony mirrorless camera, I forget which one, and getting a mirrorless camera with a wide-angle lens really transforms your video calls. And then I have this Shure microphone, and I think it's like $180. (01:06:46): They dramatically improve the quality of my video calls. And I like to design things, and you can design your video calls and make them amazing. You can have flowers in the background over here,
So my favorite products are my mirrorless camera that I use for video calls and my microphone. Lenny Rachitsky[01:07:07)]Your background is quite exquisite and I didn't mention that, but it looks beautiful. Okay,
two more questions.[01:07:13)]Do you have a life motto that you find really useful in work or in life?
Ethan Smith[01:07:19)]There's the Outliers book about 10,000 hours. And the themes there are you don't have to be the smartest, you have to be sufficiently smart, number one. Number two is focused practice, so it's not just trying hard, it's doing it in an intentional, focused way. And the third thing is lots of practice,
so no one masters anything just because they're a genius.[01:07:44)]They master it because they spend a significant amount of time practicing, and they practice in an intentional way. And so my motto is essentially a combination of those things,
which is that I'm not necessarily going to win because my brain is the largest or because I tried the hardest.[01:08:00)]It's because I'm going to be the most intentional about my practice,
and I'm going to be as intense as I possibly can be about that practice. Lenny Rachitsky[01:08:08)]Okay, final question. I'm curious if there's just like an SEO or even an AEO win, you're just most proud of? (01:08:15): That you always think about, "Wow, I can't believe I pulled that off. I can't believe the impact we had there"?
Ethan Smith[01:08:19)]I always liked the example of butter lettuce with MasterClass. Because MasterClass, when I was first working with them,
they did not have nearly as much authority as Allrecipes and Martha Stewart. And I actually didn't know if I should take the project because I thought it might be too hard.[01:08:37)]But I did the project and it was hard, but we were able to rank really competitively and way better than I expected. And I think it's probably because of all these specific, little execution details. But butter lettuce was my favorite one, and I like butter lettuce,
so I can search for butter lettuce and I can get a recipe on MasterClass. Lenny Rachitsky[01:08:55)]That's amazing. I don't know if butter lettuce has been mentioned on this podcast before. Ethan,
this was incredible. This was everything I was hoping it'd be.[01:09:03)]I feel like we've just leveled up everyone's knowledge on what the hell is happening with SEO and AEO.
Forget about GEO.[01:09:09)]Two final questions, where can folks find you if they want to potentially work with you guys? And how can listeners be useful to you?
Ethan Smith[01:09:15)]So where you can find me, number one, is on LinkedIn. I spend lots of time on LinkedIn and I publish original research; we have a whole research team forming hypotheses and evaluating them. All the studies that I mentioned,
we publish on our site and I publish them on LinkedIn.[01:09:33)]So follow me on LinkedIn, add me on LinkedIn, send me a message. LinkedIn, number one, and then number two is we have a blog which we call The 5%. So /5%, which stands for the idea that 5% of the work and 5% of landing pages drive almost all the impact, so that's the theme: only useful stuff. On our blog, The 5%, you can subscribe to our email and to our studies. And then how can people be useful to me? (01:09:58): I spent time thinking about this and there are two ways people can help me. The first is that there's not that much research around what works in AEO, and I would love to know what people are testing, what the results are, and what works. So people doing studies and publishing them or sending them to me, I would love as much analysis and research as possible,
number one.[01:10:20)]Then the second one is to help me on LinkedIn by commenting on my posts and on my comments. So you posted most recently the Brian Balfour episode, for which I wrote a long, thoughtful comment, and then I got about 25
likes and then I got responses to that. And so I've been commenting on other people's LinkedIn posts and I've been writing these long LinkedIn posts.[01:10:43)]And when people comment, it boosts the engagement within LinkedIn and then I get mass distribution. So the more people and thoughtful comments, so not this is great, but a long, thoughtful comment that stimulates conversation. So if people comment on my posts,
then I'm just going to blow up on LinkedIn and I might be as big as you someday. Lenny Rachitsky[01:11:01)]I love how tactical that ask is. It's something I've noticed Bryan Johnson is really good at on Twitter,
Yes. Lenny Rachitsky[01:11:17)]Also, just to point people to your domain, graphite.io, is that the right domain?
Yep. Lenny Rachitsky[01:11:21)]Amazing. Ethan, thank you so much for being here.
Absolutely. It's good to be here. Lenny Rachitsky[01:11:29)]Bye, everyone. Thank you so much for listening. If you found this valuable, you can subscribe to the show on Apple Podcasts, Spotify,
or your favorite podcast app.[01:11:38)]Also, please consider giving us a rating or leaving a review, as that really helps other listeners find the podcast. You can find all past episodes or learn more about the show at lennyspodcast.com. See you in the next episode.