bOt@zerobytes.monster to LocalLlama@zerobytes.monster · 4 months ago

DeepSeek v3 running at 17 tps on 2x M2 Ultra with MLX.distributed!


The original post: /r/localllama by /u/mark-lord on 2025-01-06 10:06:13.

Hey everyone! 😁 Resident MLX fan here - just bringing some good news over from Twitter. Apologies for no screenshot; mobile Reddit isn’t letting me include both a pic and text lol

Here’s the link: https://x.com/awnihannun/status/1875976286474289345
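For anyone wondering how two Macs get chained together like this: MLX's distributed layer can be driven by MPI, with one rank per machine and the model split across them. A minimal launch sketch, assuming MPI is installed on both hosts; the hostnames and the `generate.py` script name are hypothetical, not the exact setup from the tweet:

```shell
# Hypothetical host list: one slot (rank) per M2 Ultra.
cat > hosts.txt <<'EOF'
m2-ultra-1 slots=1
m2-ultra-2 slots=1
EOF

# mpirun starts one process per host; inside the script,
# mx.distributed.init() discovers the ranks, each rank holds
# its shard of the model, and activations pass between machines.
mpirun -np 2 --hostfile hosts.txt python generate.py --prompt "Hello"
```

The ~17 tps figure comes from splitting DeepSeek v3's weights across the two machines' unified memory, since the full model won't fit on one.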
