Hacker News

And what's wrong with not wanting to write functions yourself? It is a perfectly reasonable thing, and in some cases (e.g. crypto), rolling your own is strongly discouraged. That's why libraries exist: you don't want to implement your own associative array every time your work needs one, do you?

As for plagiarism, it is not something to even consider when writing code, unless your code is an art project. If someone else's code does the job better than yours, that's the code you should use: you are not trying to be original, you are trying to make a working product. There is the problem of intellectual property law, but it is narrower than plagiarism. For instance, writing an open-source drop-in replacement for some proprietary software is common practice; it is legal and often celebrated as long as it doesn't contain the original software's code. In art, it would be plagiarism.

Copyright laundering is a problem, though, and AI is very resource intensive for a result of sometimes dubious quality. But that just shows that it is not a good enough "plagiarism machine", not that using a "plagiarism machine" is wrong.



If I use a package for crypto stuff, it will generally be listed as part of the project, in an include or similar, so you can see who actually wrote the code. If you get an LLM to create it, it will write some "new original code" for you, with no ability to tell you the names of any of the people whose code went into it, people who did not consent to having it mangled into the algorithm.
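To illustrate the point about dependencies carrying their own attribution: a package manifest names the exact project and version being used, so provenance is visible without any extra effort. Here's a minimal sketch in Python (the manifest contents and comments are illustrative, not from any real project):

```python
# A requirements-style manifest declares whose code the project uses.
# Each pinned name points back to a specific, attributable upstream project.
requirements = """\
cryptography==42.0.5   # maintained by the PyCA project
requests==2.31.0       # Kenneth Reitz and contributors
"""

def declared_dependencies(manifest: str) -> list[str]:
    """Return the package names pinned in a requirements-style manifest."""
    names = []
    for line in manifest.splitlines():
        spec = line.split("#")[0].strip()  # drop trailing comments
        if spec:
            names.append(spec.split("==")[0])
    return names

print(declared_dependencies(requirements))  # ['cryptography', 'requests']
```

Code generated by an LLM has no analogous record: nothing in the output names the projects or authors whose code informed it.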

If I copy work from someone else, whether that be a paragraph of writing, a code block or art, and do not credit them, passing it off as my own creation, that's plagiarism. If the plagiarism machine can give proper attribution and context, it's not a plagiarism machine anymore, but given the incredibly lossy nature of LLMs, I don't foresee that happening. A search engine is different, as it provides attribution for the content it's giving you (ignoring the "AI summary" that is often included now). If you go to my website and copy code from me, you know where the code came from, because you got it from my website.


Why is "plagiarism" "bad"?

Modern society seems to assume any work by a person is due to that person alone, and credits that person only. But we know that is not the case. Any work by an author is the culmination of a series of contributions, perhaps not to the work directly, but often to the author, giving them the proper background and environment to do the work. The author is simply one who built upon the aggregate knowledge of the world and added a small bit of their own ideas.

I think it is bad taste to pass another's work as your own, and I believe people should be economically compensated for creating art and generating ideas, but I do not believe people are entitled to claim any "ownership" of ideas. IMHO, it is grossly egoistic.


Sure, you can't claim ownership of ideas, but if you verbatim repeat other people's content as if it is your own, and are unable to attribute it to its original creator, is that not a bit shitty? That's what LLMs are doing.


If a human learns to code by reading other people's code, and then writes their own new code, should they have to attribute all the code they ever read?

Plagiarism is a concept from academia because in academia you rise through the ranks by publishing papers and getting citations. Using someone else's work but not citing them breaks that system.

The real world doesn't work like that: your value to the world is how much you improve it. It would not help the world if everyone were forced to account for all the shoulders they have stood on like academics do. Rather, it's sufficient to merely attribute your most substantial influences and leave it at that.


If a human copies someone else's code verbatim, they should attribute the source, yes. If they learn from it and write original code, no, they don't have to cite every single piece of code they've ever read.


Yes, you've stated the current social and legal rule we have to follow.

But I don't think you've given any moral justification for the rule, and in particular, why LLMs (which are not humans and have no legal rights or obligations) have to follow it.


But some company owns the LLM, and they have legal rights and obligations. You don't get to use AI to launder breaking the law.


Is "taking credit for something someone else did is not very nice" not enough moral justification for you?



