A proposal for inline LLM instructions in HTML based on llms.txt

6 points by BryceWray


jak2k

Oh, nice! If this is used widely, I can easily prompt-inject any AI malware that doesn’t respect my robots.txt.