Meta & Microsoft CEOs discuss AI overtaking software creation
At LlamaCon 2025, Meta's first-ever AI developer conference, CEO Mark Zuckerberg and Microsoft Chairman & CEO Satya Nadella discussed on Tuesday the future of AI development, emphasizing the increasing role of AI in software creation.
Nadella said that roughly 20 to 30 percent of the code in some of Microsoft's repositories is now written by AI, while Zuckerberg predicted that AI will handle half of Meta's software development within a year. Both leaders highlighted the transformative potential of model distillation, a process that compresses large AI models into smaller, more efficient forms for broader accessibility.
Meta Platforms on Tuesday also announced an application programming interface in a bid to make it easier for businesses to build AI products using its Llama artificial-intelligence models.
META / REUTERS
Transcript
00:00 And so I'd say maybe 20-30% of the code that is inside of our repos today in some of our
00:11 projects are probably all written by software. What about you guys?
00:16 The big one that we're focused on is building an AI and a machine learning engineer to advance
00:24 the Llama development itself, right? Because I mean, our bet is sort of that in the next year,
00:30 probably, you know, I don't know, maybe half the development is going to be done by AI as opposed
00:37 to people, and then that will just kind of increase from there.
00:45 That to me is, I think, one of the biggest roles of open source, right, which is to be able to take,
00:50 let's say, some of your, even inside of the Llama family, taking a large model and then to be able
00:58 to distill it into a smaller model that has even that same model shape, I think it's a big use case.
01:06 I'm not dogmatic about closed source or open source.
01:10 I've always been fascinated by this. I mean, I think the distillation is one of the most powerful parts
01:14 of open source. And I think just because of getting it, I mean, the distillation, it's just like,
01:19 it's magic. I mean, you basically can make it so that you can get 90% or 95% of the intelligence
01:25 of something that is 20 times larger in a form factor that is so much cheaper and more efficient
01:31 to use. So then the question is, we built this for kind of server production, but a lot of the
01:37 open source community wants even smaller models. But being able to basically take whatever intelligence
01:43 you have from bigger models and distill them into whatever form factor you want to be able to run
01:47 on your laptop, on your phone, on whatever, whatever the thing is, I think is just, I don't know.
01:52 I mean, to me, this is like one of the most important things.
01:54 Yeah.
01:55 So I think so.
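For readers unfamiliar with the technique the two executives describe, below is a minimal, illustrative sketch of knowledge distillation in PyTorch: a small "student" model is trained to match the softened output distribution of a larger, frozen "teacher" model alongside the usual ground-truth labels. The model sizes, temperature, loss weighting, and all names here are hypothetical assumptions for illustration only; this is not Meta's or Microsoft's actual distillation pipeline.

```python
# Minimal knowledge-distillation sketch (illustrative assumptions throughout;
# not the Llama distillation pipeline discussed at LlamaCon).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyNet(nn.Module):
    """Toy stand-in for a model head: maps 16 input features to 100 class logits."""
    def __init__(self, hidden: int, vocab: int = 100):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(16, hidden), nn.ReLU(), nn.Linear(hidden, vocab))

    def forward(self, x):
        return self.net(x)

teacher = TinyNet(hidden=512)   # stands in for the large, frozen model
student = TinyNet(hidden=32)    # much smaller model being trained
teacher.eval()

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 2.0        # temperature to soften the teacher's output distribution (assumed value)
alpha = 0.5    # weight between distillation loss and hard-label loss (assumed value)

def distill_step(x, hard_labels):
    # Teacher predictions are used as soft targets; no gradients flow into the teacher.
    with torch.no_grad():
        teacher_logits = teacher(x)
    student_logits = student(x)

    # KL divergence between the softened teacher and student distributions.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)

    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, hard_labels)

    loss = alpha * soft_loss + (1 - alpha) * hard_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example usage with random data.
x = torch.randn(64, 16)
y = torch.randint(0, 100, (64,))
print(distill_step(x, y))
```

In the scenario Zuckerberg describes, the teacher would be a large server-scale model and the student a far smaller one intended to run on a laptop or phone; the idea of training the smaller model against the larger model's softened outputs is the same.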