I'm having trouble finding dictionaries or other references that add the qualifier that it needs to be self-tested rather than relying on the research of others. Can you point me to one?
Are you comparing an LLM running on a laptop to a Porsche?
I just find it really funny that people are willing to write things like "empirically speaking, X is obvious" without actually testing it themselves.
I've seen mixed reviews, and the most honest-sounding ones have said it has latency issues.
I don't really care that much what the average LLM power user says at this point; they're impressed by anything an LLM does. They're like toddlers entertained by the sound their Velcro shoes make.
You LLM people are going to be like my mom: once she got a maps app she completely gave up on navigating anywhere with her own brain, and now she's lost without a phone.
Except for you LLM people, it's going to be reading, writing, problem solving, and thinking in general. You'll be completely reliant on an LLM to get anything done. Have fun with that. You're cooked bro.
Not even close. "I made this DSP task faster by focusing on exactly one compute graph on one machine instead of a compute graph compiler that runs on every possible machine" is a real engineering approach, and the AI usage is incidental. Things like Gas Town are self-serving turboslop whose only purpose is to generate more slop.
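To make the contrast concrete, here's a toy sketch (hypothetical, not the project's actual code) of that idea: an FIR filter written generically versus one specialized for a single fixed kernel on one known machine.

    import numpy as np

    def fir_generic(signal, taps):
        # General-purpose path: any tap count, any machine, no assumptions.
        out = np.zeros(len(signal))
        for i in range(len(taps), len(signal)):
            out[i] = np.dot(taps, signal[i - len(taps):i][::-1])
        return out

    # Specialized path: the taps are fixed up front, so the whole "graph"
    # collapses to one vectorized convolution the SIMD units chew through.
    FIXED_TAPS = np.array([0.25, 0.5, 0.25])  # assumed 3-tap smoothing kernel

    def fir_specialized(signal):
        return np.convolve(signal, FIXED_TAPS, mode="same")

Nothing AI-specific there: it's classic specialize-for-the-target engineering, which is the point.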
Their reactive rather than proactive strategy circa 2023-24 is telling now. The market/tech zeitgeist forced their hand into producing the flop that is Apple Intelligence and the partnership with OpenAI. Their leaders were caught asleep at the wheel; particularly ineffective was John Giannandrea (who was subsequently forced out).
In addition, they have lost talent in consumer devices to OpenAI, though it remains to be seen whether that team can produce something better than Humane (also ex-Apple folks).
No doubt Apple is working on something in the background, and there are sparks of hope for them in the ML community with local LLMs on Apple Silicon.
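For anyone curious, "local LLMs on Apple Silicon" in practice looks roughly like this sketch using Apple's MLX stack via the mlx-lm package (the model name is just an example of a quantized community build; treat the exact API as approximate):

    # pip install mlx-lm  (runs on the Mac's GPU via Metal)
    from mlx_lm import load, generate

    model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")
    print(generate(model, tokenizer, prompt="Summarize MLX in one line.", max_tokens=100))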
> I imagine the C-suite might get over it at some point.
Not at Apple. They have a looooooong institutional memory that's passed down. They're still pissed at Gizmodo. I'm shocked they made a deal with Intel at all; I think if it hadn't been for the current global political uncertainty, they would never have signed that fab deal.
Maybe with Tim and Jensen going on holiday together in China, the relationship might be healed somewhat.
Things have moved on since the days where GPUs in Macs were a priority.
But then the AI race has changed things. So who knows - maybe we will one day see official eGPU support from Apple and new drivers from Nvidia. Wouldn't put money on it though...
> I still believe the lack of NVIDIA GPU support in the Mac Pro will go down as one of the greatest missed opportunities in tech.
I don’t know about that. Apple supported some full-size GPUs in past product lines and the number of users was very small. Granted, LLMs change that demand, but the audience of Mac Pro buyers who would use a full-size GPU that is impossible to obtain anyway is almost nothing compared to their laptop sales.
Part of the reason the new Mac Pro failed to find an audience is definitely macOS' hostility to third-party hardware. Who knows what Apple would be worth if they had beaten Nvidia's Grace CPU to the datacenter market. It was certainly their opportunity.
True, they could do any number of things. But a datacenter play would appear quite random to investors and their core audience. Broadcom + Nvidia however...
The Mac Pro isn't a MacBook Pro. It has PCIe slots, and macOS should be able to support the user's hardware in them, regardless of how Apple feels.
Seriously, the decades-long grudge against Nvidia that we always hear about seems like the most ridiculous and immature business move. I expect that kind of thing from an individual, you know, “I NEVER fly American Airlines!!!”, but in business? A permanent ban on one of the two players in a market, the leader no less. I don’t get it.
Maybe it doesn’t matter that much now because they’ve literally exited all the businesses where an external GPU is going to matter. But sticking with AMD all that time out of spite is just wild.
This AI memory crisis will be the incumbent memory makers' last big payday. I think Chinese manufacturers will take advantage of it and take over the memory market worldwide, excluding the United States and parts of Europe. The rest of the world will end up using Chinese memory.