Hacker News

> I reckon even a Markov chain model of similar magnitude could wow us in various ways.

I bet $100 it cannot.



It’s completely subjective, so it’s hard to have such a bet without further specification.

But I definitely feel like a sufficiently large Markov model could output sentences in response to input that would amaze or freak out at least some users. It would just be parroting its training data pretty much verbatim (at the level of short n-grams, at least), but due to the sheer amount of data, the exact source would be obscure, making it easier to attribute the output to the model 'thinking' for itself and being creative.
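To make concrete what "parroting at the level of short n-grams" means, here's a minimal sketch of the kind of n-gram Markov chain model being described (the corpus, function names, and parameters are illustrative, not from any specific system): each n-word context maps to the words that followed it in training data, and generation just walks that table.

```python
import random
from collections import defaultdict

def build_markov_model(text, n=2):
    """Map each n-gram (tuple of n words) to the list of words seen after it."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - n):
        model[tuple(words[i:i + n])].append(words[i + n])
    return model

def generate(model, length=20, seed=None):
    """Emit text by repeatedly sampling a successor of the current n-gram."""
    rng = random.Random(seed)
    state = rng.choice(list(model.keys()))
    out = list(state)
    for _ in range(length):
        successors = model.get(state)
        if not successors:  # dead end: this context never continued in training
            break
        out.append(rng.choice(successors))
        state = tuple(out[-len(state):])
    return " ".join(out)

corpus = "the cat sat on the mat and the cat saw the dog on the mat"
model = build_markov_model(corpus, n=2)
print(generate(model, length=10, seed=0))
```

Every transition it makes was literally present in the training text, so each short window of output is a verbatim quote; only the stitching between windows is novel, which is why with a huge corpus the result can look original even though it isn't.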

That’s all I was in fact saying — not that it would wow all of us all of the time.

Blake Lemoine was fired from Google for (or at least in relation to) claiming that a language model was sentient. We all have different thresholds for thinking 'oh my god, this thing has to be conscious!'.



