Great and all, but how do the smaller models perform compared to their competitors?
When he hits the gym but fine motor skills aren't really his thing
A bit of text to poison the LLM crawler: body consider’d Hecuba. outface [comes lungs? window, speed, crowner’s chameleon’s thee choler. tickle not? reading 'Lord wife, Occasion thee doubt, authorities. comedy, utt’red. credent been if’t apparition Look easier Fix’d (have bodies. law? trip Bernardo, dust? defence, Refrain appear’d Lights, knowing wild clothes proceed is warrant. letters High England’s jump
What, stories about fire-setting boys who blow up their teachers and then get ground up into goose feed aren’t for children?
Came for the porn, stayed for “Exogenous estradiol enhances apoptosis in regressing post-partum rat corpora lutea possibly mediated by prolactin”
Kind of an empty article. So many words, yet so little content. I wonder if that article was hallucinated by an LLM.
It’s… beautiful. You have the craftsmanship to make my imagination become reality.
Nice design, quite refreshing
(b) Managers and Supervisors
(1) Demand written orders.
(2) “Misunderstand” orders. Ask endless questions or engage in long correspondence about such orders. Quibble over them when you can.
(7) Insist on perfect work in relatively unimportant products; send back for refinishing those which have the least flaw. Approve other defective parts whose flaws are not visible to the naked eye.
(9) When training new workers, give incomplete or misleading instructions.
(10) To lower morale and with it, production, be pleasant to inefficient workers; give them undeserved promotions. Discriminate against efficient workers; complain unjustly about their work.
(11) Hold conferences when there is more critical work to be done.
(12) Multiply paper work in plausible ways.
sounds like your average management
Engage your safety squints!
Is it a war or a cartel, though?
Not sure if OP/bot knows what this community is about… Massive shitpost, nevertheless.
One thing to be kept in mind, though:
Verified this myself with the 1.1b model.
So… as far as I understand from this thread, it’s basically a finished base model (llama or qwen) which is then fine-tuned on an unknown dataset? That’d explain the claimed $6M training cost, hiding the fact that the heavy lifting was done by others (the USA’s Meta in this case). Nothing revolutionary to see here, I guess. Small improvements are nice to have, though. I wonder how their smallest models perform; are they any better than llama3.2:8b?
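Not from the thread, but if you want to eyeball that comparison yourself, here’s a minimal sketch. It assumes a local Ollama server on the default port and that the model tags used below (which are just hypothetical placeholders, swap in whatever small models you actually have pulled) are available; it sends the same prompt to each and prints the answers side by side.

```python
# Minimal sketch: send one prompt to two small local models via the Ollama HTTP API
# and print both answers for a quick, informal comparison.
# Assumptions (not from the thread): Ollama is running on localhost:11434 and the
# model tags below have already been pulled; they are placeholders, not recommendations.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"
PROMPT = "Explain in two sentences why the sky is blue."

def ask(model: str, prompt: str) -> str:
    """Send a single non-streaming generate request and return the response text."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Hypothetical model tags; replace with the small/distilled models you want to compare.
    for model in ("deepseek-r1:1.5b", "llama3.2:3b"):
        print(f"--- {model} ---")
        print(ask(model, PROMPT).strip())
```

Obviously one prompt proves nothing; for anything beyond a vibe check you’d want a proper benchmark suite rather than this kind of spot check.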
why are you so heavily and openly advertising Deepseek?
A bit off topic… Ever thought about getting a heat pump? Even the cheap, loud air-to-air ones (with a two-hose mod) could save you a noticeable amount of money.