Explainer
What is llms.txt — and does your website need one?
llms.txt is a plain-text file you host at the root of your domain — the same level as robots.txt. It gives large language models a short, human-written map of who you are, which URLs matter most, and which facts you want emphasised when crawlers summarise your site.
Think of it as a press kit for machines: not a replacement for good HTML or schema, but a fast way to reduce ambiguity when bots encounter dozens of templates, tracking parameters, and legacy pages.
Why llms.txt was created
Foundation models ingest billions of URLs. Without guidance, crawlers must infer importance from link graphs alone — fine for Wikipedia, noisy for a five-page plumber site with a decade of blog spam. llms.txt lets the site owner say: “Start here; ignore these paths; here are the facts I stand behind.”
The proposal circulated in 2024; adoption is still uneven, which is exactly why early movers get credit in GEO audits.
How llms.txt works
Canonical location:
https://yourdomain.com/llms.txt
Most files use a light Markdown structure:
# Your Business Name
> One sentence: what you do, where you are, who you serve.
## Key Pages
- [Homepage](https://yourdomain.com/): Primary offer
- [About](https://yourdomain.com/about): Team and story
- [Contact](https://yourdomain.com/contact): Address, hours, phone
## Key Facts
- Founded: 2020
- Location: Liverpool, UK
- Specialisation: Pakistani and Kashmiri cuisine
There is no formal specification yet — clarity beats cleverness. Update the file whenever your services or locations change.
Does your website need one?
Strictly speaking, no file is legally mandatory. Practically, yes — if you care about AI visibility:
- It helps crawlers orient quickly on small sites.
- It gives you a controlled blurb some tools surface verbatim.
- It takes about ten minutes once you know your facts.
- Visus treats llms.txt as one of many signals; sites with it tend to score higher on corroboration-adjacent checks.
Visus checks llms.txt as one of roughly 29 AI visibility signals. Publishing an accurate file is a fast way to lift your technical story — especially alongside schema and NAP consistency.
Who uses llms.txt?
Early adopters skew technical — SaaS, devtools, agencies baking GEO into launch checklists. Local businesses are catching up as owners realise ChatGPT and Perplexity send foot traffic the same way Google once did.
How to create your llms.txt in ten minutes
- Open any text editor.
- Write a heading with your legal or trading name.
- Add a one-line description with city and category.
- List 3–6 URLs that actually matter — not every tag archive.
- Add a Key facts block: address, reg number if relevant, email, hours.
- Save as llms.txt (UTF-8, no BOM surprises).
- Upload to your web root via FTP, Git, or your host’s file manager.
- Visit https://yourdomain.com/llms.txt in an incognito window to verify.
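If you prefer to generate the file rather than type it, the steps above can be scripted. A minimal sketch in Python — the business details below are placeholder examples, not part of any standard:

```python
from pathlib import Path

# Placeholder example values - swap in your own business facts.
business = {
    "name": "Example Kitchen",
    "summary": "Pakistani and Kashmiri restaurant in Liverpool, UK.",
    "pages": [
        ("Homepage", "https://yourdomain.com/", "Primary offer"),
        ("About", "https://yourdomain.com/about", "Team and story"),
        ("Contact", "https://yourdomain.com/contact", "Address, hours, phone"),
    ],
    "facts": {"Founded": "2020", "Location": "Liverpool, UK"},
}

# Assemble the light Markdown structure described above.
lines = [f"# {business['name']}", f"> {business['summary']}", "", "## Key Pages"]
lines += [f"- [{title}]({url}): {desc}" for title, url, desc in business["pages"]]
lines += ["", "## Key Facts"]
lines += [f"- {key}: {value}" for key, value in business["facts"].items()]

# write_text with encoding="utf-8" emits no BOM, which is what you want.
Path("llms.txt").write_text("\n".join(lines) + "\n", encoding="utf-8")
```

Run it in your site's root (or upload the resulting file) and you are done.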
Template skeleton:
# [Your Business Name]
> [One sentence: what you do, where you are, who you serve]
## About
[2–3 factual sentences]
## Key Pages
- [Home](https://yourdomain.com/): [description]
- [About](https://yourdomain.com/about): [description]
- [Contact](https://yourdomain.com/contact): [address, phone, hours]
## Key Facts
- Location: [full address]
- Founded: [year]
- Specialisation: [niche]
- Contact: [email]
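Before publishing, it is worth catching half-filled templates. A small sanity-check sketch — the structural rules it enforces (top heading, blockquote summary, Key Pages and Key Facts sections) mirror the skeleton above, but they are conventions, not a formal spec:

```python
import re

def validate_llms_txt(text: str) -> list:
    """Return a list of problems; an empty list means the skeleton looks filled in."""
    problems = []
    if not re.search(r"^# \S", text, re.M):
        problems.append("missing top-level '# Business Name' heading")
    if not re.search(r"^> \S", text, re.M):
        problems.append("missing '>' one-line description")
    for section in ("## Key Pages", "## Key Facts"):
        if section not in text:
            problems.append(f"missing '{section}' section")
    if not re.search(r"^- \[.+\]\(https?://.+\)", text, re.M):
        problems.append("no Markdown links under Key Pages")
    # Catch bracketed placeholders left over from the template.
    if re.search(r"\[(Your Business Name|One sentence|year|full address)", text):
        problems.append("template placeholders still present")
    return problems
```

Feed it the file contents; anything it reports is a quick fix before upload.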
llms.txt vs robots.txt
robots.txt tells automated agents which paths they should or should not fetch. It is a gate.
llms.txt is a guide — it does not enforce permissions on its own. You need both: robots policy for compliance, llms.txt for orientation.
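To make the division of labour concrete, here is an illustrative robots.txt (paths and user-agents are examples; GPTBot is OpenAI's crawler name). llms.txt sits alongside it at the root but carries no directives at all:

```text
# robots.txt - the gate: which paths agents may fetch
User-agent: GPTBot
Allow: /

User-agent: *
Disallow: /wp-admin/
```

robots.txt entries like these are enforced by well-behaved crawlers; llms.txt is only read for orientation, which is why you want both.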
Checking if your site has llms.txt
Load /llms.txt directly. 200 + plain text means you are serving something. 404 means you still need to publish (or your host is blocking the path).
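If you want to check by hand, a short Python sketch — the status and Content-Type interpretation follows the rule of thumb above, and the domain is whatever you pass in:

```python
import urllib.request
import urllib.error

def classify(status: int, content_type: str) -> str:
    """Map an HTTP response to a plain-English verdict."""
    if status == 200 and ("text/plain" in content_type or "markdown" in content_type):
        return "published"
    if status == 200:
        return "served, but with an unexpected Content-Type"
    if status == 404:
        return "missing"
    return f"unexpected status {status}"

def check(domain: str) -> str:
    """Fetch https://<domain>/llms.txt and classify the result."""
    url = f"https://{domain}/llms.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return classify(resp.status, resp.headers.get("Content-Type", ""))
    except urllib.error.HTTPError as exc:
        return classify(exc.code, "")
    except urllib.error.URLError as exc:
        return f"could not reach {url}: {exc.reason}"
```

For example, `check("yourdomain.com")` returns "published" when the file is live and served as plain text.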
Visus automatically checks llms.txt during the free audit alongside schema, crawler access, and content blocks — so you get a pass/fail without manual curl commands.
Check if your site has llms.txt
Run a free Visus audit — llms.txt plus dozens of other signals in about a minute.
Run free audit