I think this is terrible, though I suppose it's less bad if ChatGPT didn't get the phone number from the bank's files/data.
If ChatGPT just provided it from public info/the training set, am I wrong to think that isn’t as bad?
Just for clarity, I think it’s not a good idea to use LLMs if you care whether the answer is right or wrong. So this is a terrible use case.
parisidau 179 days ago [-]
I would argue the source of the information doesn’t matter. The bank disclosed a phone number of a customer to another customer.
dontdoxxme 179 days ago [-]
You sound like a crazy person. The actually relevant bit of law is APP 13[1], which defines "holds" and shows that the source of the information does matter. If you're going to quote half the law but ignore the relevant piece, at least talk to a lawyer.

[1]: https://www.oaic.gov.au/privacy/australian-privacy-principle...
Pardon? Why do you think I sound like a crazy person? Go on, please explain.
Also APP 13 is tangentially relevant, for sure, but not the crux of it.
APP 13 is far less relevant than, for example, APP 11.
femto 179 days ago [-]
Good luck getting any recompense. CBA disclosed your phone number. I'm aware of a company that disclosed 3,000 high-resolution colour passport scans, along with full personal details, from a travel booking website. About half of the records were for school children. No one was notified that their data was leaked. Diddly squat happened to that company.
legacynl 179 days ago [-]
Wow this is terrible