Efficient AI Knowledge Wiki Strategy
Episode 34 • 14th April 2026 • Lone Wolf Unleashed - avoid exhaustion, reclaim your time using tools, systems and AI • Mike Fox
Duration: 00:16:33


Shownotes

Are your AI tools burning through tokens faster than ever? You're not alone — and in this episode I'm sharing the framework that's changed how I manage knowledge and query AI at scale.

I'm Mike from Lone Wolf Unleashed — I help solo founders build business systems so they can switch off sooner and live larger. Today I'm walking through Andrej Karpathy's wiki methodology, how to implement it in Obsidian, and why it reduces AI token consumption by up to 85% compared to traditional approaches.

I also cover how I've applied this directly to Lone Wolf Unleashed — building a target operating model, setting up agent teams in Paperclip, and designing a content production architecture that closes the gap between production and distribution for a solo operator.

If you're hitting your AI limits, spending too much on token-heavy workflows, or just looking for a smarter way to manage your business knowledge — this one's for you.

──────────────────────────────

Chapters

──────────────────────────────

00:00 — Introduction: Token limits and why this matters for solo founders

00:43 — How Andrej Karpathy's wiki methodology works

02:50 — Setting up a wiki ingest workflow in Obsidian

03:33 — Why the wiki is 80–85% more token efficient

04:38 — Visualising knowledge connections with Obsidian's graph view

05:56 — Replacing a team of analysts as a solo operator

06:43 — Target operating models and the 5Ps framework

07:50 — Introducing Paperclip and automated content production

10:13 — Building an AI Business Analyst assistant

13:00 — What this means for your business and your life

14:21 — Resources and wrap-up

──────────────────────────────

RESOURCES

──────────────────────────────

Wiki resources and setup guide: https://lonewolfunleashed.com/resources

──────────────────────────────

CONNECT

──────────────────────────────

Website: https://lonewolfunleashed.com

Email Mike: mike@lonewolfunleashed.com

Mentioned in this episode:

You might also like...

Check out the "Websites Made Simple" podcast with Holly Christie at https://websitesmadesimple.co.uk/

This podcast is part of the Podknows Podcasting ICN Network

Transcript

Speaker:

G'day. My name is Mike from Lone Wolf Unleashed, and today we're going to be talking about what to do with these token limits in our AI tools, and maybe some ways we can start to capture information and very efficiently query the information that's in our repositories. This is relevant for the solo founder because when we get the information out of our heads, it has to go somewhere, and when we need to query our knowledge base, we want to be able to do that in a very cost-effective way. So I'm going to take you through that, and that's going to include Karpathy's wiki methodology. There's been a lot of talk about it this week, so we're going to go through that. So strap yourself in. Here we go.

This past week I have hit my Claude limits the fastest I ever have, and there has been some chatter around about how Opus in particular has been degrading. There are some challenges there around how many tokens are being consumed to do the same tasks we used to do. I have certainly found that, yes, I have been working on some other things that have consumed more tokens, but my normal activity has not really changed that much, and I've still been going through tokens at a higher rate. Something that's got me thinking is that because I was already on a Max plan, I was spoiling myself: I was keeping myself in Opus a lot, because a lot of what I do is analysis work. However, I've started to break down the different tasks I'm doing across different models, partly because I've started to hit these limits and I don't really want to have to spend a lot more money every month to maintain my AI use.

Something that came out this week was from the AI guy Andrej Karpathy, the same guy that coined the term vibe coding. He's been working on how to effectively query large information sets for very specific information and have the AI retrieve it in a very efficient way. What he's done is create a wiki-style setup in Obsidian that you can replicate; the tweets are there for you to refer to, and I'm going to have a resource on my website that can help you install this in your own ecosystem.

Basically, when you have a knowledge source, and I do this a lot with YouTube now, I'll look at the video and pull down the transcript, which has a lot of the information, plus the embed of the YouTube video so I can watch it again. I can pull this in via the Obsidian web clipper browser extension, and when I use that, it will insert a new note with that content into my wiki inbox. Then I can use a command in Claude called wiki ingest, and it will ingest that article, create a summary, and create stubs. Stubs are references to other files on similar topics that might be referenced throughout. And then it is basically there for me to use. There's another command, wiki lint, which goes through and sees which stubs have a lot of incoming references, so you can build up your knowledge ecosystem over time.

Why is this useful? Well, at the moment, a lot of people in organizations traditionally build up really big procedures, with all their information and processes, and there's a whole bunch of files you have to start to manage. What this does is allow the AI to go in, organize all of your information, and build the connections between different places, regardless of where in your ecosystem they are stored, and to do that automatically. We don't have to sit there and figure out which text needs to link to which part of the ecosystem. And it gives us a space to query the AI against the knowledge base. It'll go in and look at the index, see if there's anything relevant there, look at our summary pages, and then, if it needs to go deeper, hit the main pages behind those summary pages for more information.

This is so much more token efficient: about 80 to 85% more efficient than before. Why? Because it doesn't have to read the entire markdown page every time it needs to retrieve information. You're basically creating a filter, a funnel (or a reverse funnel, in this case) for it to go down: query a little bit, get some information and some direction, query the next layer, get some more information, query the next layer, and so on down. That's excellent, because we can take our resources, our information, and our procedure pages, ingest them, and codify them, and it will store them in different domain files.
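As a rough illustration of that layered retrieval, here is a minimal Python sketch. This is not Karpathy's actual implementation: the file layout (an index.md, a summaries/ folder, a pages/ folder) and the "full:" pointer convention are assumptions made up for the example.

```python
from pathlib import Path

def layered_lookup(vault: Path, topic: str, need_detail: bool = False) -> str:
    """Retrieve knowledge in layers: index -> summary -> full page.

    Each layer is only read if the previous one pointed to it, so most
    queries never touch the (large) full pages. Assumed layout:
      index.md           - one "topic: summary-file" line per topic
      summaries/<t>.md   - short summary; first line names the full page
      pages/<t>.md       - the complete note
    """
    # Layer 1: scan the small index for the topic.
    for line in (vault / "index.md").read_text().splitlines():
        name, _, summary_file = line.partition(": ")
        if name.strip().lower() != topic.lower():
            continue
        # Layer 2: read only the matching summary.
        summary = (vault / "summaries" / summary_file.strip()).read_text()
        if not need_detail:
            return summary
        # Layer 3: drill into the full page only when detail is requested.
        page_file = summary.splitlines()[0].removeprefix("full: ").strip()
        return (vault / "pages" / page_file).read_text()
    return ""  # topic not in the index
```

The token saving comes from the early returns: a broad question is answered from the index and summary alone, and the full page is read only when the caller explicitly asks for detail.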

It's all set up there so it can be easily stored and queried. Now, the joy of doing this in Obsidian is that I can look at all the different stubs and the connections between the different notes, pull up my graph view, and see how the different topics, sources, and articles all relate to each other. This is an absolute game changer, because previously, say five years ago when I was in an enterprise role, a lot of manual work from a team of analysts would go into a process management system, for example. All those connections had to be made manually, and to get visibility on things you'd have to drill down and find the right file in the right place. You don't have to do that anymore. I can do this solo for an entire organization now. I don't need a team of analysts, and I don't need to manually make all those connections anymore. So having a visualization tool to help us understand the connections between the different tools and notes in the ecosystem is very, very helpful. It quite literally helps us connect the dots on how things within your business are connected.
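The lint pass mentioned earlier essentially boils down to counting incoming links across the vault, which is also what the graph view is visualising. Here is a small sketch, assuming Obsidian-style [[wikilink]] syntax; the function name and vault layout are illustrative, not part of any real tool.

```python
import re
from collections import Counter
from pathlib import Path

# Obsidian-style links: [[Target]], [[Target|alias]], [[Target#section]]
WIKILINK = re.compile(r"\[\[([^\]|#]+)")

def incoming_link_counts(vault: Path) -> Counter:
    """Count how many notes link to each target. Stubs with many
    incoming references are the notes most worth fleshing out next."""
    counts: Counter = Counter()
    for note in vault.rglob("*.md"):
        # A note linking to the same target twice still counts once.
        targets = {m.strip() for m in WIKILINK.findall(note.read_text())}
        counts.update(targets)
    return counts
```

Sorting the resulting counter by count gives a simple priority list of which stubs to expand first.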

And this leads into the next thing I've been working on; my work, I feel, has accelerated a ridiculous amount recently. It's how we actually form up target operating models. Now, I know this is a bit of a pivot away from wikis and things like that. However, if we zoom out of your business and come up with what I call my 5Ps framework, starting with the profile: the profile is a very high-level view of your business, and what you end up with is a target operating model. The target operating model paints a picture of what your organization looks like: who the key players in that organization are, who the key users are, the key processes, systems, and so on, all on a page. I've just done this for Lone Wolf Unleashed, for the social components of it, and for the community and program parts of it. What I'm able to do now is codify what that target operating model is, and I can insert it into an agent team to plan out how I can automate as much of that end to end as possible. Now, why is this important? Well, if I take that target operating model and create an architecture spec, I can feed it into my wiki and know exactly where I'm up to. I can see how decisions I've made in the past affect what I might do in the future, and how I'm going to break down the plan to make that target state become true. It is amazing. It is simply amazing.
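Codifying a target operating model can be as simple as structured data that both the agent team and the wiki ingest step can read. The sketch below is purely illustrative; every field name is an invention for the example, not Mike's actual 5Ps template.

```python
from dataclasses import dataclass, field

@dataclass
class TargetOperatingModel:
    """One-page snapshot of how a business area should run.
    Field names here are illustrative, not a formal 5Ps template."""
    area: str
    key_players: list[str] = field(default_factory=list)
    key_users: list[str] = field(default_factory=list)
    processes: list[str] = field(default_factory=list)
    systems: list[str] = field(default_factory=list)

    def to_markdown(self) -> str:
        """Render as a wiki note so it can be ingested like any other source."""
        lines = [f"# Target Operating Model: {self.area}"]
        for label, items in [("Key players", self.key_players),
                             ("Key users", self.key_users),
                             ("Processes", self.processes),
                             ("Systems", self.systems)]:
            lines.append(f"## {label}")
            lines += [f"- {item}" for item in items]
        return "\n".join(lines)
```

Rendering the model to markdown is what makes the "feed it into my wiki" step possible: the operating model becomes just another note the AI can link and query.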

And what I've started doing already is building this into a tool called Paperclip. Now, Paperclip came out last month, and I'm going to be talking a lot more about it in future episodes, because I think it's so, so powerful for setting up solo businesses to deliver epic amounts of value. What I've done is feed it the target operating model, the description, and the architecture, and I've set up agent roles to help build this out. What we're going to be able to do is have agents go out to APIs: we'll take a podcast episode, have it automatically clipped, and have the descriptions and all that sort of thing automatically generated. That's amazing, because an amazing amount of work goes into managing that type of content. So what we're going to end up with is a state where that middleman step between the production of content and its distribution, that gap, is going to be closed.
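I can't speak to Paperclip's actual API, but the end-to-end flow described here (episode in, clips and description out) can be sketched as a plain pipeline of steps. Everything below, from the step names to the stub implementations, is illustrative only.

```python
from dataclasses import dataclass, field

@dataclass
class Episode:
    title: str
    transcript: str
    clips: list[str] = field(default_factory=list)
    description: str = ""

def clip_highlights(ep: Episode) -> Episode:
    """Stand-in for an agent that picks shareable moments; here we
    just take the first sentence of each paragraph."""
    ep.clips = [p.split(". ")[0] for p in ep.transcript.split("\n\n") if p]
    return ep

def write_description(ep: Episode) -> Episode:
    """Stand-in for an agent that drafts the show notes."""
    ep.description = f"{ep.title}: {ep.transcript[:60]}..."
    return ep

def run_pipeline(ep: Episode, steps) -> Episode:
    """Each agent/step takes the episode and hands it to the next,
    closing the gap between production and distribution."""
    for step in steps:
        ep = step(ep)
    return ep
```

The design point is the shape, not the stubs: each step is swappable, so a real agent (or API call) can replace a placeholder without touching the rest of the pipeline.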

The time it takes to manage that piece is going to be dramatically reduced. Now, what does this mean? It means that I, as a solo operator, can go out and have more meaningful conversations with my prospects. It means I can focus more on the community-building side of things rather than the delivery side, the content side, the marketing side. That's an amazing result that I want to get myself towards, because it means I don't have to keep going out and hiring people to do that kind of work. Their time can be freed up to do other things, and my time certainly can be freed up to do other things.

And again, this all comes back to that wiki space. The wiki can be used for any big piece of work, any persistent knowledge. So if I've got a spec or a build that I'm going through, or general procedures I need to look at, or a strategy, I can now see how all of those things connect together, and I can see whether I'm on track.

Where this has led my thinking is that participants in my program have said, "Hey Mike, you're really, really good at translating between what the business needs and how we actually implement that in IT." And I see this with a lot of clients: there's often a mismatch in the languages that both sides of that equation use, and where I sit in that equation is as a translator between them. When the business user says they do this, here's what they mean, and here's what it can look like in the platform. There are even a lot of big businesses out there that say, "Hey, we don't really have the budget available for a BA." And I can see how their systems operate: they definitely have not invested in business analysis, because the platforms don't talk to each other, their users are very unhappy with them, and their technology department isn't delivering very well, all because they don't have the information they need. Everyone in that scenario has been set up to fail.

So what I've done now is ask: how do I make this better? I can make it better by codifying exactly the things I do to deliver results. How do I create a digital AI BA assistant that a subject matter expert can sit alongside, one that can prompt, create requirements documents, and produce workshop plans with the right questions to ask? All of that will have an inbuilt wiki which you can use to track the requirements, the constraints, the questions, and the outstanding items, all within there.
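As one way such an inbuilt wiki could track things, each item the assistant captures could be a small typed record. The kinds, statuses, and field names below are assumptions for the sketch, not any real product's schema.

```python
from dataclasses import dataclass

@dataclass
class WikiItem:
    kind: str            # e.g. "requirement", "constraint", "question"
    text: str
    status: str = "open" # "open" or "closed"

def outstanding(items: list[WikiItem]) -> list[WikiItem]:
    """The 'where are things up to?' query: everything still open."""
    return [i for i in items if i.status == "open"]
```

Even this trivial structure is enough to answer the status questions a BA would normally field by hand.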

So this is why it's so powerful: you can treat basically any application within the knowledge ecosystem as a knowledge worker, and you can create an agent team that automatically utilizes a wiki to deliver you more results while still being able to tell you exactly where things are up to.

So what does this mean? There's been so much covered in the last 15 minutes. I would like you to go away this week and think about some processes you're using your knowledge for, and how AI might be able to speed up the outcomes. What we're really looking to do is level up our use of AI to manage these burdensome administrative tasks so that we can focus more on the things that matter. What are the things that matter? Relationships, client conversations, asking good questions, networking. And what does that allow us to do outside of the business? Family time, time with your spouse or partner, time with friends, more time exercising: the really meaty stuff that you know you want to do more of, but you're working too much to do right now.

So have that in mind. This is possible. We are in a space now where this is possible, and we want to be in a space where we can do it efficiently. We're getting to the crux of this now: AI models are getting more expensive to use, we need to be token efficient, and we need to start leveraging the technology in front of us to free up our time. Being able to do this means we can remain solo businesses; it means we don't need to go out and get more and more people to do these really laborious admin tasks. I want you to go away and think about that.

For the wiki stuff, there are going to be some resources on our website: go check out lonewolfunleashed.com/resources. Thank you so much for joining me today. I really appreciate it; you could have been doing a million things, but you decided to hang out with me and figure out how to use AI more efficiently this week, alongside your knowledge, your resources, and your wikis. Thank you so much, and I'll see you next week.
