The article is about Kyutai, a French AI lab that aims to compete with ChatGPT and others while being fully open source (research papers, models, and training data).
They are also aiming to include capabilities like sound, images, etc. (according to this article (French): https://www.clubic.com/actualite-509350-intelligence-artificielle-xavier-niel-free-et-l-ancien-pdg-de-google-lancent-kyutai-un-concurrent-europeen-a-openai.html )
The article also gives some French context.
Ideally, they’d just blow the entire $330M training an LLM, and release the weights. In reality, much of that money will probably go into paying salaries, various smaller research projects, etc.
Ideally, they wouldn’t be paying salaries? What?
deleted by creator
The context is that LLMs need a big up-front capital expenditure to get started, because of the processor time needed to train these giant neural networks. This is a huge barrier to the development of a fully open source LLM. Once such a foundation model is available, building on top of it is relatively cheap; one can then envision an explosion of open source models targeting specific applications, which would be amazing.
So if the bulk of this €300M could go into training, it would go a long way to plugging the gap. But in reality, a lot of that sum is going to be dissipated into other expenses, so there’s going to be a lot less than €300M for actual training.
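To make the "building on top is cheaper" point concrete, here's a rough back-of-the-envelope sketch using the common approximation that training costs about 6·N·D FLOPs (N = parameters, D = tokens). The model size and token counts below are illustrative assumptions, not anything Kyutai has announced:

```python
# Rough training-cost comparison: pretraining a foundation model from
# scratch vs. fine-tuning on top of it. Uses the common rule of thumb
# that training costs ~6 * N * D FLOPs (N = parameters, D = tokens).
def train_flops(params, tokens):
    return 6 * params * tokens

# Illustrative numbers only (not Kyutai's actual plans):
pretrain = train_flops(7e9, 1.4e12)  # 7B-parameter model, 1.4T tokens
finetune = train_flops(7e9, 10e9)    # same model, 10B tokens of fine-tuning

print(f"pretraining: {pretrain:.2e} FLOPs")
print(f"fine-tuning: {finetune:.2e} FLOPs")
print(f"ratio: {pretrain / finetune:.0f}x")
```

Under these assumptions, the fine-tune is over a hundred times cheaper in compute than the pretraining run, which is why a freely available foundation model matters so much.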
Is there any way we can decentralize the training of neural networks?
I recall something released a while ago that let people use their computers for scientific computations. Couldn’t something similar be done for training AI?
There is a project (AI Horde) that allows you to donate compute for inference. I’m not sure why the same doesn’t exist for training. I think the RAM/VRAM requirements just can’t be lowered/split.
Another way to contribute is by helping with training data. LAION, which created the dataset behind Stable Diffusion, is a volunteer effort. Stable Diffusion itself was developed at a tax-funded public university in Germany. However, the cost of the processing for training, etc. was covered by a single rich guy.
Folding@home.
I dunno. I wouldn’t lend my spare power to put people out of a job.
Btw yes! Why not include such a project in something like BOINC and let people help train free AI?
Methinks cyd might be a libertarian 😄
Good luck training an LLM without any developers.
deleted by creator
$330m is not nothing. But with funding split between a telecom CEO and a shipping & logistics CEO, one has to wonder what sort of direction and tuning the team might be encouraged to explore. How will they stack up against existing and proven open source non-profits with impressive releases, like EleutherAI?
These open source projects are neat, in that they give the average person the opportunity to peek under the hood of an LLM that they’d never be able to run on consumer level hardware. There are some interesting things to find, especially in the dataset snapshots that Eleuther made available.
In general, it's kind of cool to see France on the cutting edge of these things. And I think it’s worth saluting any project that moves to decentralize power away from states and megacorps, who seal wonderful, powerful things in black boxes.
France is on the cutting edge of AI indeed: FAIR (Facebook's AI lab) has a big office in Paris, and its boss is Yann LeCun. So there are plenty of researchers getting trained on the state of the art.
Makes sense it’d be the French again. They pioneered the internet after all.
I’m sorry, are you crazy? Do you know any part of the internet's history? It was American universities, the government, and defense contractors that created the internet.
deleted by creator
Thanks for the assist. I’m not an expert on the deep lore of the internet, but remember a few things from History class.
What did the Teletubbies have to do with it then? I could have sworn early development was tied to their government research?
You could read the article. It was actually DARPA, part of the Department of Defense (not the CIA), that initially created the first working network. But it was some time later that CYCLADES in France demonstrated the first inter-network, with a lot of the working concepts that would later make the internet as we know it today. It wouldn't go global until the TCP/IP protocols were invented, which was a joint effort of a lot of universities across Europe and the USA.
I hope they actually do, unlike "Open"AI
Please put a space between the link and parenthesis so the link doesn’t break
There I fixed the link. Sadly still in some weird arcane language but never mind
Sry, did it. The apps I use seem to be smart enough to stop at the HTML.
Smart. Even Google knows they can’t compete with open source models, since open source development of AI models iterates and optimizes much faster, and a model built to serve compliance requirements can’t catch up with it.
So an open source model is their best way to leapfrog these giants.
What use would an AI be if it were made by French developers? The source would likely be in French (i.e. variable, function, and object names, as well as comments). Yes, they are that in love with their own language. Check out their names for just about everything related to computers…
Tell me you know nothing about coding without telling me
příkaz trojúhelník "a pd opakuj 3 [ dopředu :a vpravo 120 ] konec

(Czech Logo; roughly: to triangle :a pendown repeat 3 [ forward :a right 120 ] end)
wut!? Did some baguette French-kiss your mom or something?
Isn’t it Quebec(Canada) you’re thinking about?
I’ve never seen French code in my jobs; it’s all in English. Most frameworks are in English anyway, so why would they code in French?
PHP's Symfony is from a French company, and it’s in English; the docs are also available in English.
And there might be French translations of English words, yes. How is that crazy? That’s the definition of a language; otherwise we would all have the same words for everything, and therefore the same language.
In Quebec all code must be in both official languages, Maple and C
As a SW engineer from Germany: you would be surprised how much code exists in other languages. But I would expect companies on the cutting edge of technology, working closely with universities or with open source, to usually choose English.
You ever heard of VLC?
Let me guess, you are a ~~Murcian~~ Murican who only understands English.

Must be really difficult being from Spain and only knowing English.
If you had put as much effort into searching the internet as into going through my comment history to speculate where I’m from, you would know that in Spain (depending on the region) it’s mandatory to learn up to 4 languages.
I’ve not looked at your comment history at all, I was just making a silly joke about how you wrote “Murcian” instead of “Murican”. No harm intended
Oh, hehe.
I’ll take this opportunity to highlight that Scikit-Learn (Open Source ML library) is developed in large part by INRIA (based in Paris) and people have been relying on their code for preprocessing, baselines, and the rest for a long time. And all of the documentation is in English.
Good for them. I had different experiences.
On the other hand, MicMac, which is by far the best free photogrammetry package, is developed by France’s IGN and is loaded with French comments, function and variable names, etc.
However the English wiki has come a LONG way since I first had to try to figure it out, and while it’s still much more of a box of tools and parts than a single click app, it’s likely gone from “set of blueprints and sack of unsorted bolts” to “kit car with rolling chassis”
The difference here is that MicMac was probably developed as an internal tool, with no intention to distribute it at first.
Meh, most software is largely in English. It’s not quite like commercial piloting, but it’s pretty prevalent.
Seriously, all code produced by French devs is in English, minus a few personal projects.
Eh get the AI to translate the code to your language of preference.
Personally as an Italian, I think it would be good for Europeans to learn other languages aside from English… And the most widespread are French and Spanish.
I went through school with Spanish classes for 4 years and Italian for two; it was obligatory to choose a third language. The second I was done with school, I had already forgotten everything I learnt of those languages. If I’m not going to use a language daily, I will forget it. It was a monumental waste of the students’ time.

Now I have a spouse who speaks another language, one that was never an option to learn in school. We both speak English, though. There is little way to predict which languages will be useful for a student. English as a second language is a good bet: every other European, except Russians, has been able to communicate with me in English. 99% will never benefit from a third language. They should have taught computer science or something instead.

Before LLMs, I thought they should at least have focused on teaching languages that are hard to machine translate, like Russian, Japanese or Korean. That is my opinion.
Probably not; that’s Quebec. I’m in a French lab and everything is written in English. You don’t really have a choice, as you are collaborating internationally. Even if the lab is based and funded in France, not all of the people inside will be French. They plan to have scientific advisors who are not French, according to the link.
Let’s say you’re right and it will only be in French. It’s a language with hundreds of millions of speakers. Why not have it in French?
While it may have hundreds of millions of speakers, there are still billions of people who don’t speak French.
deleted by creator
nice :)
So they have enough money to train one model.
It seems like their goal is not to train new LLMs, but to actually do scientific research. Large language models are such a tiny part of the whole machine learning and AI field that it’s ridiculous the amount of attention they get from mass media. But people do like their stupid chatbots.
I would be happy to see the real story behind every piece of tech news like this. You know, where the real money will be.
$330m is not much.