New study shows smart chatbots can figure out who you really are from just a few posts… and it only costs a couple of dollars.
Nothing of substance here. Stylometric analysis was already a thing. Easy enough to defeat, and with correct opsec you can avoid it. Burn accounts regularly. Use accounts for specific topics, or share accounts with others. Don't post personal details online; on the internet, everyone is a cat.
I certainly am!
deleted by creator
I don’t like having to be vague about my age, nationality, job etc, because I’d rather be honest and relate to others online, but sadly it’s a necessity in the modern landscape
That’s why you have multiple accounts. Some are for above-the-table things and can have your personal details; others are for eating the rich.
One account that can be correlated to place/city, willing to discuss local news and issues.
One account that can be correlated to family status, willing to mention details about relationships.
One account that can be correlated to career, willing to mention details about educational background, industry news, the job market, the workplace, etc.
One account that can be correlated to each distinct hobby or interest. Some interests correlate among themselves (like an all-sports account that discusses multiple sports) and are safe to discuss on a single account. Like my current account, which is tech-oriented, including some stuff about games or Linux or networking or even the tech industry. But keep the different interests on separate accounts.
Then different accounts for topics that you consider controversial or private.
And, preferably, spread all those accounts across multiple instances so that instance admins can’t link accounts from metadata (client, OS, IP address, email verification), use completely unique usernames, and avoid unique markers like esoteric phrases, unique autocorrect errors, etc.
Even if an adversary can link two accounts, they probably can’t link all of them.
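For anyone curious what the stylometric linking mentioned in this thread actually looks like, here's a minimal illustrative sketch (plain Python, not the method from the study): compare two accounts by the character n-grams their posts share. Writing quirks like pet phrases and punctuation habits survive a username change.

```python
from collections import Counter
import math

def char_ngrams(text, n=3):
    """Character n-gram frequency profile of a text sample."""
    text = text.lower()
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine_similarity(a, b):
    """Cosine similarity between two frequency profiles (0..1)."""
    dot = sum(a[g] * b[g] for g in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

# Hypothetical posts: a and b share verbal tics, c is a different writer
post_a = "Honestly, I reckon you'd want to burn accounts regularly, mate."
post_b = "Honestly, I reckon the instance admins could link those, mate."
post_c = "LGTM. Shipped the fix in v2.1, see changelog for details."

same = cosine_similarity(char_ngrams(post_a), char_ngrams(post_b))
diff = cosine_similarity(char_ngrams(post_a), char_ngrams(post_c))
print(same > diff)  # shared quirks push the same-writer score higher
```

Real attacks use far richer features (and now LLMs), but this is the basic idea behind "unique markers like esoteric phrases" giving you away.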
Looks like the LLM can be used to cross-reference data from your pseudo-private account to your public account. What a surprise.
It’s a good thing that I work for Dick’s Fish & Chips located on the main street of a bustling city in Antarctica. I wouldn’t want the LLM to get it wrong.
The same Dick’s Fish & Chips where in 1998, The Undertaker threw Mankind off Hell In A Cell, and plummeted 16 ft through an announcer’s table?
That Dick’s Fish & Chips?
Yup. The only difference between this and what any individual could already do is just time and scale.
Data brokers and government surveillance organizations have already had specialized tools to do this sort of thing for a while now; it’s just that LLMs reduce the complexity and specialization needed for an individual person to build an implementation that works well.
And if they’re wrong, they still get the couple of dollars, so win-win. I can unmask anyone you want online from just a few posts and a name randomizer.
deleted by creator
Based on the research, they had 60-something percent accuracy. But the test data was HackerNews accounts that linked to LinkedIn. I would guess that anyone linking their anonymous account to their LinkedIn profile isn’t really trying to hide themselves.
Oh no, who’s making them use all those terrible services that whore them out like that?