Editor's Note: In this insightful commentary, Bill Gropp, Director of the National Center for Supercomputing Applications (NCSA), explores the unique trajectories of AI research within academic and commercial realms. Gropp delves into the differing motivations and methodologies of these sectors, highlighting how academia prioritizes innovation and understanding, while the commercial sector drives rapid, scalable applications. This piece serves as a reminder of the collaborative potential between these spheres in advancing AI technology responsibly and effectively.
www.hpcwire.com/, Nov. 06, 2024 –
Around this same time last year, I expounded on what the "Future of AI" might entail. A lot has happened in the 12 months since then, including new approaches, new trends, and, yes, new complications.
Much of the news about artificial intelligence stems from efforts in the commercial sector. Whether it's well-known chatbots and large language models like ChatGPT or legacy tech companies like Google and Microsoft investing heavily in AI development, public awareness of the industry continues to grow.
But what about academic research into AI?
Research is absolutely a major part of AI in the commercial sector. But in the end, commercial companies are looking to make money, and to make it before somebody else does. If you think of AI as revolutionary, you want to be the one who delivers the revolution, because that's where you'll get the greatest return. That's why I think we see such huge commercial investment in AI systems that apply existing algorithms, or modest improvements to them, combined with ever more GPU and computing resources to deliver AI capabilities you can put into products.
On the academic research side, there's a lot of interest in understanding the strengths and weaknesses of AI systems – "explainable AI." There's great interest in figuring out how we can build AI systems that don't produce false outputs. NCSA's partnership with the National Deep Inference Fabric provides an environment in which researchers can better understand how inference happens and how a trained model comes up with a certain kind of response.
There's also substantial interest in adding local updates to a trained model. How do you increase the speed at which you can do that while reducing the computing and energy it takes? A huge amount of research needs to be done that doesn't require the enormous scale found in the commercial sector. This is one of the goals of the National Artificial Intelligence Research Resource (NAIRR), which aims to democratize access to computing resources for researchers working in the many areas of AI that don't require tens or hundreds of thousands of GPUs.
Different Goals, Similar Outcomes
Of course, the academic and commercial communities operate under different guidelines, goals, and rewards. Both are effective in their own way, and each can benefit the other. The academic system encourages people to explore new ideas and expand the body of knowledge. Research aims to answer questions, discover new knowledge, gain insight and understanding, and make the results available to everyone by publishing or releasing artifacts others can use. The commercial sector – capitalism – encourages people to think about how to deliver value measured as profit, but value that can benefit society as well.