
NetBSD Completely Bans AI Generated Code

Whether you like the direction AI is going or not, you have to decide how you're going to use it (or not use it); it's no longer enough …


by Brodie Robertson


24 thoughts on “NetBSD Completely Bans AI Generated Code”

  • Maybe AI could be useful for generating Linux documentation?!

  • Good policy by NetBSD.
    Although the "AI" companies got their systems working by scouring the net and copying other devs' code, at some point in the future they will claim they own the code. They will want royalties because you used Copilot or whatever.

  • Sounds like a very prudent policy.
    "It would be hard to detect."
    "Sure, but not impossible. We just don't know. So let's have a policy that will save us the chance of a mountain of lawyers, grief, and paperwork."

  • You remember what Monsanto did? They had GMO plants cross pollinate with neighboring farms and then went after said farms for patent infringement when the resulting seed grew plants that contained some of the patented genes. Imagine if tainted code made its way into open source software.

  • Speaking of AI music, some guy just got AI to sing the Linux kernel source code and it slaps 😀

  • Brazilian election law has very explicit guidelines about marketing with AI. It has an explicit ban on using it to mimic voices, faces, or anything of the sort, and if AI is used, it must be declared on the picture or whatever.

  • honestly it's fine if microsoft steals and de-licenses code using LLMs because it saves people a few minutes here and there, so what's the big deal

  • For me ChatGPT is great for little Arduino projects, and it can also be helpful for finding errors.

  • Yes, let’s just discourage people donating time and effort to a project from trying to find ways to be more productive. Sorry I meant to just say, yes, good job banning AI.

  • Basically, unless you're using your own AI engine trained on code you already own and maybe code that is already appropriately licensed, and nothing else, you can't be sure that any code generated won't be tainted by incompatible copyright, and therefore NetBSD can not accept the code in order to prevent potential legal repercussions. I'm sure some will still find its way in, but yeah, AI is going to be hell for OpenSource.

  • Doesn't "such as" specifically declare the items that follow as non-limiting, i.e. only examples, by default? Non-English speaker here; I'm not used to the juridical language of English or the US.

  • Honestly, I don't get it. One necessary component for copyright is human authorship (cf. the case of the monkey photograph). Code written by "AI" is therefore never copyrighted. Or do they presume that it counts as remix of copyrighted code?

  • Honestly, I never know which code has a tainted licence – someone copied five lines from Stack Overflow, changed the coding style, and renamed two of the four variables. Is that fine? Someone was inspired by a function on GitHub – is that fine? Someone wrote a function according to a paper, fixed some critical bugs that were perhaps intended as "copy protection", and didn't check whether the algorithm is patented (including checking patents in Arabic, Japanese and Navajo) … is that fine? Technically you almost cannot code at all if you care about this (and it's a good reason not to open source anything).
    And GPL is like: if you see int a=0; and copy it, you should release your code under the GPL.

  • Choose any commercial AI like ChatGPT, Copilot, etc., and somewhere in the fine print (or sometimes not-so-fine print) you will find words to the effect of: this thing makes mistakes, use at your own risk. You have to cover your arse.

  • NetBSD showing leadership and caring about the BSD licence. The thing is that all of the training data was probably never licensed by the AI model maker. Maybe BSD licensors could include something in their licence about training AI models on their code. I'd imagine they'd expect a notice that the generated code contains training data from the NetBSD project. But then GPL would have a different set of requirements, and off we go. Then again, maybe it is just like coding in the real world, where we learn to code little bits from everywhere, and therefore AI can do the same. It will depend on how heavily the model relies on one source. What a time to be alive.

  • Unrelated to NetBSD, but… I don't agree with the pearl-clutching paranoia around LLMs generating code. I don't see it as much different from a compiler. Like, how much machine code have you written recently? The compiler takes your intention, written in a human-readable language, and converts it into a program that can actually be used.

  • there is no issue with copyright and AI generated ANYTHING!! this whole idea is based on a fundamental misrepresentation and misunderstanding of copyright law. IT DOES NOT MATTER WHAT DATA THE MODEL IS TRAINED ON UNLESS IT REPRODUCES A WORK VERBATIM! IT NEVER HAS BEEN AND NEVER WILL BE AN INFRINGEMENT OF COPYRIGHT TO USE SOMEONE ELSE'S WORK AS INSPIRATION, AS A STARTING POINT, OR TO LEARN FROM, ETC… as for the quality of the code it produces, I can't speak to that because I can't even write a basic bash script. however, I can tell you that right now is the worst it will ever be

  • Legal regulations are usually answers to problems that have already arisen. The question of who owns AI-generated content is a new problem, so it's no wonder it is a grey area. It is possible that for the NetBSD Foundation this is also a problem that has already appeared, e.g. a member might have needed to decide what to do with such code.

Comments are closed.