Art and controversy go together, and that's been particularly true this month. A painting by British street artist Banksy began to self-destruct, via a built-in shredder, moments after selling at auction. And last week, a piece of AI-produced art sold at Christie's auction house for $432,500, or more than 40 times the initial estimate of $7,000 to $10,000.

However, even as Christie's asks whether "artificial intelligence [is] set to become art's next medium," a new controversy has arisen. Not over the fact that AI software created art -- that's been around for a while. But 19-year-old Robbie Barrat says he wrote the code used by the three French art students who call themselves Obvious to make the painting -- and that they cashed the check without sending any of it his way.

According to The Verge, which broke the story about Barrat's role, the members of Obvious don't dispute his involvement. But, as the story noted, "they didn't publicize that fact either."

The technology traces back to Ian Goodfellow, now a Google researcher, who developed the idea of pitting two separate AI programs against each other. The first system, called a generator, goes through examples of a finished product -- hundreds, thousands, or even more paintings, in this case -- trying to learn their patterns and produce something consistent with them.

The second system acts as a judge, deciding whether it can tell the difference between the first system's work and the finished materials used in the learning process. If the judge, called a discriminator, can tell, it sends the work back and the generator tries again.
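The back-and-forth described above can be sketched in miniature. This is a toy illustration of the adversarial idea, not Barrat's or Obvious's actual code: the "paintings" are single numbers clustered near 4.0, and both systems are reduced to one-parameter maps with a logistic judge.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the training set: each "painting" is one number near 4.0.
def real_batch(n):
    return rng.normal(4.0, 0.5, size=n)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator (first system): turns noise z into a candidate, fake = a*z + b.
a, b = 1.0, 0.0
# Discriminator (the judge): D(x) = sigmoid(v*x + c), near 1 means "looks real".
v, c = 0.1, 0.0
lr = 0.01

for step in range(2000):
    z = rng.normal(size=32)
    real = real_batch(32)
    fake = a * z + b

    # Train the judge: push D(real) toward 1 and D(fake) toward 0.
    d_real = sigmoid(v * real + c)
    d_fake = sigmoid(v * fake + c)
    grad_logit = np.concatenate([d_real - 1.0, d_fake])  # dLoss/dlogit
    x_all = np.concatenate([real, fake])
    v -= lr * np.mean(grad_logit * x_all)
    c -= lr * np.mean(grad_logit)

    # Train the generator to fool the judge: push D(fake) toward 1.
    d_fake = sigmoid(v * fake + c)
    g_logit = (d_fake - 1.0) * v  # chain rule through the judge's logit
    a -= lr * np.mean(g_logit * z)
    b -= lr * np.mean(g_logit)

# After training, generated samples drift toward the real cluster.
samples = a * rng.normal(size=1000) + b
```

Real GANs replace these one-parameter maps with deep neural networks and pixel grids, but the loop is the same: the judge learns to separate real from fake, and the generator learns to close the gap.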

Now Barrat's work comes into play. He has created a lot of AI-generated art using Goodfellow's basic idea and has freely shared his code, helping other artists get started -- including Obvious.

Barrat's beef, as he told The Washington Post, is that Obvious made few changes and produced something that looked like the direct output of his shared code.

"I was really expecting people to use [the code] as components for their own project. But I never thought anybody would sell it, just because it's not high-quality work," Barrat, now 19 and working at a Stanford University AI research lab, told The Washington Post. "It was a project I did in my free time when I was 17."

Germany-based artist Mario Klingemann told both The Washington Post and The Verge that the work was mostly Barrat's -- 90 percent, he told the latter -- and that anyone could download a copy of the code and start working.

Depending on the license Barrat granted on GitHub, the code-sharing site where he posted his work for others to use, Obvious may have had the right to do whatever it wanted with the code. Or perhaps not. Either way, it won't happen again, because Barrat has since added some explicit terms:

When using any outputs of the models, credit me. Don't sell the outputs of the pre-trained models, modified or not. If you have any questions email me before doing anything.

That gets to two basic problems of innovation. The process typically requires combining ideas or concepts that hadn't been put together in the same way before. Are you or your team giving credit where it's due? Not doing so can be the ethical equivalent of plagiarism: pretending to have created something that was the work of another.

Then there are the potential legal issues. Do you have explicit permission to make commercial use of someone else's work? Code is copyrighted; you need a license to use it and include it in your own project. But many developers, whether working on art projects or on something for an employer, don't carefully check the limitations on use.

More power to those who innovate and help move progress along. But be sure you're doing so ethically ... and legally. Facing a lawsuit because you assumed someone would forgive you rather than asking their permission first is not a great prospect.