daredevil

joined 1 year ago
[–] [email protected] 2 points 9 months ago (1 children)

defaming them without due diligence, think about that before continuing

The irony here is unbelievable, rofl. You can't make this up. My previous statement called you childish and desperate for attention. Thanks for reminding me of that fact, so I can stop wasting my time. It's very clear you're not interested in a genuine and constructive conversation.

[–] [email protected] 2 points 9 months ago* (last edited 9 months ago) (3 children)

It's not one week of inactivity, it has been going on for months

Looks at 2 months straight of kbin devlogs since October, when the man was having pretty significant personal issues

Not to mention he was recently sick, was tending to financial issues and personal matters, and was handling formalities relating to the project. This isn't even mentioning that he communicated all of this in the devlog magazine, or the fact that he has implemented suggestions multiple times at the community's request to enhance QoL and has given users agency to make mod contributions.

You might want to take your own advice. This has also allowed me to revise my earlier statement. You people are actually insane.

[–] [email protected] 4 points 9 months ago (1 children)

Every post I see from them further paints them as very childish and desperate for attention.

[–] [email protected] 4 points 9 months ago

Agreed, every post I see from them further paints them as very childish and desperate for attention.

[–] [email protected] 2 points 9 months ago

Came here to post because I've also seen The Legend of Zelda: Symphony of the Goddesses live. The poster for it is behind me at the moment. Great experience.

[–] [email protected] 2 points 9 months ago

I've only felt the need to change distros once, from Linux Mint to EndeavourOS, because I wanted Wayland support. I realize there were ways to get Wayland working on Mint in the past, but I've already made the switch and gotten used to my current setup. I personally don't feel like I'm missing out by sticking to one distro, tbh. If you're enjoying Mint, I'd suggest sticking with it, unless another distro fulfills a specific need you can't meet on Mint.

 
 

Terminal Trove showcases the best of the terminal. Discover a collection of CLI, TUI, and other developer tools at Terminal Trove.

 

Song of the Ancients / Devola (イニシエノウタ/デボル) · SQUARE ENIX MUSIC · Keiichi Okabe · MONACA

NieR Gestalt & NieR Replicant Original Soundtrack

Released on: 2010-04-21

[–] [email protected] 2 points 10 months ago

Came here with this show in mind. Would recommend.

[–] [email protected] 1 points 11 months ago

I haven't, but I'll keep this in mind for the future -- thanks.

[–] [email protected] 1 points 11 months ago (2 children)

I believe I was when I tried it before, but it's possible I misconfigured things.

[–] [email protected] 3 points 11 months ago* (last edited 11 months ago)

I'll give it a shot later today, thanks

edit: Tried out mistral-7b-instruct-v0.1.Q4_K_M.gguf via the LM Studio app. It runs smoother than I expected -- I get about 7-8 tokens/sec. I'll definitely be playing around with this some more later.

 

On Monday, Mistral AI announced a new AI language model called Mixtral 8x7B, a "mixture of experts" (MoE) model with open weights that reportedly truly matches OpenAI's GPT-3.5 in performance—an achievement that has been claimed by others in the past but is being taken seriously by AI heavyweights such as OpenAI's Andrej Karpathy and Jim Fan. That means we're closer to having a ChatGPT-3.5-level AI assistant that can run freely and locally on our devices, given the right implementation.

Mistral, based in Paris and founded by Arthur Mensch, Guillaume Lample, and Timothée Lacroix, has seen a rapid rise in the AI space recently. It has been quickly raising venture capital to become a sort of French anti-OpenAI, championing smaller models with eye-catching performance. Most notably, Mistral's models run locally with open weights that can be downloaded and used with fewer restrictions than closed AI models from OpenAI, Anthropic, or Google. (In this context "weights" are the computer files that represent a trained neural network.)

Mixtral 8x7B can process a 32K token context window and works in French, German, Spanish, Italian, and English. It works much like ChatGPT in that it can assist with compositional tasks, analyze data, troubleshoot software, and write programs. Mistral claims that it outperforms Meta's much larger LLaMA 2 70B (70 billion parameter) large language model and that it matches or exceeds OpenAI's GPT-3.5 on certain benchmarks, as seen in the chart below.
A chart of Mixtral 8x7B performance vs. LLaMA 2 70B and GPT-3.5, provided by Mistral.

The speed at which open-weights AI models have caught up with OpenAI's top offering a year ago has taken many by surprise. Pietro Schirano, the founder of EverArt, wrote on X, "Just incredible. I am running Mistral 8x7B instruct at 27 tokens per second, completely locally thanks to @LMStudioAI. A model that scores better than GPT-3.5, locally. Imagine where we will be 1 year from now."

LexicaArt founder Sharif Shameem tweeted, "The Mixtral MoE model genuinely feels like an inflection point — a true GPT-3.5 level model that can run at 30 tokens/sec on an M1. Imagine all the products now possible when inference is 100% free and your data stays on your device." To which Andrej Karpathy replied, "Agree. It feels like the capability / reasoning power has made major strides, lagging behind is more the UI/UX of the whole thing, maybe some tool use finetuning, maybe some RAG databases, etc."

Mixture of experts

So what does mixture of experts mean? As this excellent Hugging Face guide explains, it refers to a machine-learning model architecture where a gate network routes input data to different specialized neural network components, known as "experts," for processing. The advantage of this is that it enables more efficient and scalable model training and inference, as only a subset of experts are activated for each input, reducing the computational load compared to monolithic models with equivalent parameter counts.

In layperson's terms, a MoE is like having a team of specialized workers (the "experts") in a factory, where a smart system (the "gate network") decides which worker is best suited to handle each specific task. This setup makes the whole process more efficient and faster, as each task is done by an expert in that area, and not every worker needs to be involved in every task, unlike in a traditional factory where every worker might have to do a bit of everything.
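
To make the routing idea concrete, here's a minimal toy sketch in Python/NumPy. The dimensions, top-2 selection, and ReLU feed-forward blocks are illustrative assumptions for the sake of the example, not Mixtral's actual configuration:

```python
# Toy mixture-of-experts routing: a gate network scores the experts,
# and only the top-k of them actually process the input token.
import numpy as np

rng = np.random.default_rng(0)
d_model, d_ff, n_experts, top_k = 16, 64, 8, 2

# Each "expert" is a small two-layer feed-forward block.
experts = [
    (rng.normal(size=(d_model, d_ff)), rng.normal(size=(d_ff, d_model)))
    for _ in range(n_experts)
]
# The gate is a single linear layer producing one score per expert.
gate_w = rng.normal(size=(d_model, n_experts))

def moe_forward(x):
    """Route one token vector x through its top-k experts."""
    scores = x @ gate_w                    # one score per expert
    top = np.argsort(scores)[-top_k:]      # pick the best-scoring experts
    weights = np.exp(scores[top])
    weights /= weights.sum()               # softmax over the chosen experts
    out = np.zeros_like(x)
    for w, i in zip(weights, top):
        w1, w2 = experts[i]
        out += w * (np.maximum(x @ w1, 0) @ w2)  # ReLU feed-forward block
    return out

token = rng.normal(size=d_model)
print(moe_forward(token).shape)  # (16,) -- only 2 of the 8 experts ran
```

Only the gate runs for every token; six of the eight expert blocks sit idle on each input, which is where the inference savings come from.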

OpenAI has been rumored to use a MoE system with GPT-4, accounting for some of its performance. In the case of Mixtral 8x7B, the name implies that the model is a mixture of eight 7 billion-parameter neural networks, but as Karpathy pointed out in a tweet, the name is slightly misleading because, "it is not all 7B params that are being 8x'd, only the FeedForward blocks in the Transformer are 8x'd, everything else stays the same. Hence also why total number of params is not 56B but only 46.7B."
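
Those figures also check out with a bit of back-of-the-envelope algebra. Assuming Mistral's published numbers (46.7B total parameters and 12.9B active per token, with two experts routed per layer), you can separate the shared weights from the per-expert feed-forward weights; the split below is derived from those two figures, not taken from the model card:

```python
# Split Mixtral's parameters into a shared part (attention, embeddings,
# norms) and a per-expert feed-forward (FFN) part, using Mistral's
# published figures: 46.7B total, 12.9B active per token.
total = 46.7e9    # shared + 8 * ffn  (every expert counted once)
active = 12.9e9   # shared + 2 * ffn  (two experts run per token)

ffn = (total - active) / 6      # subtracting the equations above
shared = total - 8 * ffn

print(f"per-expert FFN  ~ {ffn / 1e9:.1f}B params")     # ~5.6B
print(f"shared weights  ~ {shared / 1e9:.1f}B params")  # ~1.6B
# Naively 8x-ing a full 7B model would give 56B; replicating only the
# feed-forward blocks is why the real total lands at 46.7B.
```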

Mixtral is not the first "open" mixture of experts model, but it is notable for its relatively small size in parameter count and performance. It's out now, available on Hugging Face and BitTorrent under the Apache 2.0 license. People have been running it locally using an app called LM Studio. Also, Mistral began offering beta access to an API for three levels of Mistral models on Monday.
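
For anyone who'd rather script it than click around LM Studio, a minimal sketch with Hugging Face's transformers library might look like this. The model ID matches Mistral's Hugging Face listing, but the prompt format and generation settings here are assumptions, and the unquantized model needs far more memory than a typical desktop has:

```python
# Minimal sketch: load Mixtral from Hugging Face and generate text.
# Requires the transformers and accelerate packages, plus enough
# GPU/CPU memory; most local setups use a quantized build instead.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Mistral's instruct models expect the [INST] ... [/INST] wrapper.
prompt = "[INST] Explain mixture-of-experts in one paragraph. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```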

16
submitted 11 months ago* (last edited 11 months ago) by [email protected] to c/[email protected]
 

Title: Let the Battles Begin!
Name: Final Fantasy VII
Year Released: 1997
Composer: Nobuo Uematsu
Developer: Square
Platform: PlayStation

 

Title: Green Hill Zone
Game Name: Sonic the Hedgehog
Year Released: 1991
Composer: Masato Nakamura
Developer: Sonic Team
Platform: Sega Genesis

 

Composer: Junichi Masuda
Game: Pokémon Red and Blue (Pokémon Red and Green in Japan)
Year Released: 1996
Platform: Game Boy

 

• Game: Mega Man 3 (Capcom, 1990, NES)
• ReMixer(s): Disco Dan
• Composer(s): Harumi Fujita, Yasuaki Fujita
• Song(s): 'Title'
• Posted: 2001-11-22, evaluated by djpretzel

5
submitted 1 year ago* (last edited 1 year ago) by [email protected] to c/[email protected]
 

"...Euphrasie, three days ago, one of your journalists secretly followed a suspect all the way from the Court of Fontaine to Romaritime Harbor, and almost ended up being tied up and thrown into the sea by a gang of criminals. Whether or not there's any truth in the notion that 'nearer to the action is closer to the truth,' surely Miss Charlotte doesn't value her reports more than she does her own life?"
— Yet another exasperated exchange between Captain Chevreuse of the Special Security and Surveillance Patrol and Euphrasie, Editor-in-Chief of The Steambird

◆ Name: Charlotte
◆ Title: Lens of Verity
◆ Reporter of The Steambird
◆ Vision: Cryo
◆ Constellation: Hualina Veritas

Fontaine's famous newspaper The Steambird has a veritable legion of reporters it can call upon, each with their own area of expertise. Some specialize in celebrity gossip, others follow the word on the street, while others still focus on political affairs...

But among them all, there is one that stands head and shoulders above the rest thanks to her seemingly boundless reserve of energy and perseverance — the inimitable Charlotte.

Unswervingly committed to the principle that "nearer to the action is closer to the truth," Charlotte has a habit of popping up literally anywhere and everywhere in Fontaine — from its widest avenues to its narrowest back alleys, its highest vantage points to its lowest subterranean vaults, even its tallest mountains to its deepest undersea caverns. She captures the "truth" with her Kamera, records it in her articles, and finally unveils it for all to see.

And when the "truth" comes out, she's met with reactions ranging from applause, to embarrassment, to outright fury. There are even some who would resort to any means necessary to make a particular article connected to themselves disappear. Or alternatively, to just make Charlotte disappear.

For this reason, the newspaper's Editor-in-Chief Euphrasie has on numerous occasions felt the need to distance Charlotte from the Court of Fontaine by sending her off on faraway "field reporting" jobs, only recalling her once the Maison Gardiennage or Special Security and Surveillance Patrol had finally managed to clear things up.

But despite all this, neither the toil of the job itself nor the pressure of external denunciations and threats has ever fazed Charlotte in the slightest.

With her trusty companion Monsieur Verite by her side, she invariably carries out her journalistic duties with unfaltering fervor, rushing about in pursuit of all the "truths" out there just waiting to be discovered.
