Asking for advice: running DeepSeek on the base model Mac Mini M4
by Poster
Feb 6, 2025
According to the information shared in the Bilibili videos [Apple Mac Mini M4 installs an offline version of DeepSeek](https://www.bilibili.com/video/BV1opPCeDE6e) and [DeepSeek local deployment 14b, 32b, 70b: a simple comparison](https://www.bilibili.com/video/BV1NANweEETT), the base model Mac Mini should be able to run the 14b model.
But the 14b model is rather dumb. I wonder: if I take two base Mac Minis and link them in parallel with Thunderbolt 4 cables, could I run the 32b model? (A rough memory estimate is sketched just below.)
How would the speed be? And can two Thunderbolt 4 cables be used to double the bandwidth?
Hoping someone with the hardware can test this.
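For context on why a single base machine isn't enough, here is a back-of-envelope memory estimate. This is a minimal sketch assuming 4-bit quantization and roughly 20% runtime overhead (illustrative assumptions, not measurements):

```python
# Back-of-envelope memory estimate for locally hosted models.
# Assumptions (not measured): 4-bit weights (~0.5 bytes/parameter)
# plus ~20% overhead for KV cache and runtime buffers.
GB = 1024 ** 3

def est_memory_gb(params_billion: float, bits: int = 4, overhead: float = 1.2) -> float:
    """Estimated resident memory in GiB for a quantized model."""
    return params_billion * 1e9 * (bits / 8) * overhead / GB

for size in (14, 32, 70):
    print(f"{size}b: ~{est_memory_gb(size):.1f} GB")

# 14b: ~7.8 GB   -> fits the 16 GB base Mac Mini
# 32b: ~17.9 GB  -> over 16 GB, hence the two-machine idea
# 70b: ~39.1 GB
```

Note that in pipeline-style splits the link between the two machines mostly carries small per-token activations, so each machine's own memory bandwidth tends to matter more than the Thunderbolt bandwidth.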
Replies
-
Anonymous1423 Feb 6, 2025 It should be feasible; I've seen it done on sites outside China: https://i.imgur.com/ksBfTtR.png But if 14b doesn't satisfy you, 32b won't be much better.
-
Poster Feb 6, 2025 @Anonymous1423 Watch the second Bilibili video in the OP: 14b gets the test question wrong, and you need at least 32b to get the correct answer.
-
Poster Feb 6, 2025 @Anonymous1423 Would eight of these (2,000 × 8) be more cost-effective than stacking NVIDIA compute cards? (Rough numbers sketched below.)
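Since local token generation is largely memory-bandwidth-bound, one way to frame the cost question is memory and bandwidth per unit of money. A rough sketch using ballpark public specs; the prices are assumptions, not quotes:

```python
# Ballpark cost comparison; figures approximate public spec sheets and
# street prices (assumptions, not quotes). Decode speed for local LLMs
# scales roughly with memory bandwidth.
machines = {
    "Mac Mini M4 (base, 16 GB)": {"price_usd": 599,  "mem_gb": 16, "bw_gbs": 120},
    "RTX 4090 (24 GB)":          {"price_usd": 1800, "mem_gb": 24, "bw_gbs": 1008},
}

for name, m in machines.items():
    print(f"{name}: {m['mem_gb'] / m['price_usd'] * 1000:.1f} GB per $1000, "
          f"{m['bw_gbs'] / m['price_usd'] * 1000:.0f} GB/s per $1000")
```

On these numbers the Macs win on memory per dollar but lose heavily on bandwidth per dollar, and clustering adds its own overhead.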
-
Anonymous1424 Feb 6, 2025 If you really want to do this, wait for NVIDIA's Project DIGITS desktop mini machine. It has 128 GB of memory, and its compute is higher than a 4090 though not better than a 5090... It goes on sale in May.
-
Anonymous9463 Feb 25, 2025 The models you can run locally don't perform very well anyway; I've run it on an iMac with 24 GB.
-
Anonymous1425 Feb 28, 2025 @Poster Don't you know about exo-labs? It doesn't only support Macs: it can chain a 2080 Ti together with a Mac, with the same operation and the same automatic discovery, except the 2080 Ti machine can only run Linux. (Sketch below.)
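For reference, once exo nodes have discovered each other, the cluster can be queried through its ChatGPT-compatible HTTP API. A minimal sketch, assuming exo's documented default port (52415 at the time of writing) and a hypothetical model id; check your install's docs for the exact names:

```python
# Minimal sketch: query an exo cluster's ChatGPT-compatible endpoint.
# Port 52415 is exo's documented default; the model id below is an
# assumption -- substitute whatever your exo version actually lists.
import requests

resp = requests.post(
    "http://localhost:52415/v1/chat/completions",
    json={
        "model": "deepseek-r1-distill-qwen-32b",  # hypothetical model id
        "messages": [{"role": "user", "content": "Which is larger, 9.11 or 9.9?"}],
        "temperature": 0.6,
    },
    timeout=600,  # distributed 32b inference can be slow
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```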