I went from using Google for 100% of my searches to about 10% within a year. I have ChatGPT and Claude open on my desktop all day, every day. Things like troubleshooting maintenance issues are far easier, and far more productive, with AI than with Google. It's just one of many reasons I'm bullish on AI long-term as a beneficiary.
Since AI is a learning tool, it's only as good as the information it can access and summarize today. So who, or what, continues to feed the beast going forward? In short: what incentive do people, companies, and universities have to keep producing the information, data, and content that Claude, ChatGPT, and Copilot scrape and repackage? AI is great at what it does today, but its output is only as good as what's known and published today. There comes a point where that information becomes dated, or even incorrect.
I'm already seeing contracts with provisions around AI models: clauses and questions such as "Is our data used to develop AI models?" and "All AI models, agents, and code developed through human interaction or learning utilizing XYZ data shall remain the sole property of XYZ company."
Research papers, scientific journals, and disease studies are all slamming the door shut, restricting publication of data, forbidding AI access, and claiming IP rights over any "product" developed from their data.
It'll be an interesting frontier over the next decade. If I were in law school right now, I'd be switching my focus to data privacy and intellectual property law in a hurry.