Perspective: Fair use and AI

The New York Times recently filed a lawsuit against Microsoft and OpenAI — the company that makes ChatGPT.

According to the lawsuit, the AI models not only steal the news giant's work but also spread false information about stories it has produced.

The misinformation comes from AI “hallucinations,” which happen when the AI simply makes up information. Attributing fabricated information to the Times hurts the paper's credibility.

OpenAI and Microsoft claim that training their models on the Times’s work is protected by “fair use,” the copyright doctrine that allows limited use of copyrighted works without permission.

The lawsuit seeks billions of dollars in damages and resembles other suits brought against tech giants by authors whose work was also used to train AI models.

The ethical questions here are big. Doesn't every writer learn and borrow style from other writers? But shouldn't writers and organizations be fairly compensated? Yes and yes.

I have written articles on AI ethics for the Reynolds Journalism Institute, SUCCESS Magazine, and the now-defunct Habtic Standard. I am still torn.

In her new book, Unmasking AI, Joy Buolamwini describes her shift from wanting more diverse training data to worrying about informed consent and the potential misuse of biometric data in facial recognition. But writing isn't biometric data, and artists take inspiration from each other all the time.

I haven't decided how I feel yet, but I do believe that writers should be paid — especially when the technology in question aims to replace them.

I'm Nia Norris, and this is my perspective.

Originally from Pittsburgh, Nia Springer-Norris moved to DeKalb in 2021 to pursue a Master of Arts in Communication Studies with an emphasis on Journalism Studies. Nia is also a freelance journalist, editor, and communication consultant.