Hacker News | qalmakka's comments


No plugins? Those are what give NP++ its real power and usability - for example I use the XML and JSON pretty print functionality daily (on Windows, on my work machine).

Otherwise Kate or Gedit are just fine for Linux.


Yes. My comment above.


I think they're in a win-win situation. Big AI companies would love to see local computing die in favour of the cloud, because they're well aware that the moment an open model that can run on non-ludicrous consumer hardware appears, they're screwed. In that scenario Nvidia, AMD and the like would be the only ones profiting from it - even though I'm not convinced they'd prefer going back to fighting for B2C while B2B is so much simpler for them


If you want to run AI models at scale and with reasonably quick responses, there aren't many alternatives to datacenter hardware. Consumer hardware is great for repurposing existing "free" compute (including gaming PCs, pro workstations etc. at the higher end) and as basic insurance against rug pulls from the big AI vendors, but increased scale will probably still bring very real benefits.


Currently, yes. But I don't find it hard to imagine that in a while we could get reasonably light open models with a level of reasoning similar to today's Opus, for instance. In such a scenario, how many people would opt to pay for a far more expensive cloud subscription? Especially since lots of people are already not that interested in paying for frontier models even where it makes sense. Unless we keep getting a constant, never-ending stream of improvements, we're basically bound to reach a point where, unless you really need more, you're fine with the basic, cheaper local alternative you don't have to pay for monthly.


I think average users are already okay with the reasoning level they'd get with current open models. But the big AI firms have pivoted their frontier models towards the enterprise: coding and research, as opposed to general chat. And scale is quite important for these uses; ordinary pro hardware is not enough.


This is really just a question of product design meeting the technology.

Today, lots of integer compute happens on local devices for some purposes, and in the cloud for others.

The same is already true for matmul: lots of FLOPS are being spent locally on photo and video processing, speech to text, …

No obvious reason you wouldn’t want to specialize LLM tasks similarly, especially as long-running agents increasingly take over from chatbots as the dominant interaction architecture.


> If you want to run AI models at scale and with reasonably quick response, there's not many alternatives to datacenter hardware.

Right now, certainly. Things change. What was a datacenter rack yesterday could be a laptop tomorrow.


At a consistent amount of usage, datacenters are at least an order of magnitude more hardware efficient. I'm sure Nvidia and AMD would be fine fighting for B2C if it meant volume would be 10+x.

Now, given they can't satisfy current volume, they are forced to settle for just having crazy margins.


The problem with B2C is that you need leverage of some kind (more demanding applications, planned obsolescence, ...) to get people to keep buying your product. The average consumer may simply consider themselves satisfied with the old product they already own and only replace it when it breaks down. With the cloud, by contrast, you can keep people hooked on getting the latest product whether they need it or not, plus artificial demand from datacentres and such.


I think businesses running datacenters are much less likely to frivolously buy the latest GPUs with no functional incentive than general consumers are...


Future upgrade cycles on phones, laptops and PCs will be driven by SoCs that embed some type of ASIC running a specific model. Every 6 months there will be a new, better version to upgrade to, which will require a new device. This is how Apple will be able to shorten cycles from 3 years to 6-12 months.


There are also many Chinese AI-targeted GPU/NPU producers. You can get hold of some boards on taobao.com. They are usable, to a degree.

No, Nvidia and AMD are not the only ones benefiting.


CLI is massively superior to MCP in my experience. First, because I can also understand what's going on and do it myself if necessary. Second, because it's so much cheaper in terms of tokens it's not even funny


NAT is arguably a very broken solution. IPv4 was never meant to do address translation, period. NAT creates all sorts of issues because in the end you're still pretending all communications are end to end, just with a proxy in the middle. We had to invent STUN and all sorts of hole-punching techniques just to make things work decently, but they're lacking and have lots of issues we can't fix without changing IPv4. I do see why some people may like it, but it isn't a security measure, and there are like a billion different ways to get better, more reliable security with IPv6. The "I don't want my devices to have public, discoverable IPs" argument is moot when you have literally billions of addresses assigned to you. With the /48 your ISP is supposed to assign you, you could have 4 billion devices connected, each one with a set of 281 trillion unique addresses. You could randomly pick an IP per TCP/UDP connection and not exhaust them in _centuries_. The whole argument is kind of moot IMHO; we have ways to do privacy on top of IPv6 that don't require fucking up your network stack and relying on rendezvous servers to set things up.
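
For the skeptical, a quick sanity check on those numbers in Python (the 2^32 x 2^48 split is just one arbitrary way to carve up the 80 host bits a /48 leaves you):

    prefix = 48
    host_bits = 128 - prefix          # 80 bits are yours
    total = 2 ** host_bits            # ~1.2e24 addresses
    subnets = 2 ** (64 - prefix)      # 65,536 standard /64 subnets
    # 4 billion "devices" times 281 trillion addresses each:
    assert (2 ** 32) * (2 ** 48) == total
    print(f"{total:.2e} addresses, {subnets} /64 subnets")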

We may also argue that NAT basically forces you to rely on cloud services - even a basic peer-to-peer VoIP call is a poor experience as soon as you have 2 layers of NAT. We had to move to centralised services because IPv4 made hosting your own content extremely hard, causing little interest in symmetrical DSL/fiber, leading to less interest in ensuring peer-to-peer connections between consumers are fast enough, which led to the rise of the cloud and so on. I truly believe the Internet would look very different today if people could have just accessed their computers from anywhere back in the '00s without having to know networking


And the worst part about CGNAT is that you're left with two bad solutions:

Either EIM/EIF (preferably with hairpinning), where direct connections are practical but you have to limit users to a really low number of "connections", which breaks power users.

Or EDM/EDF, where users get a higher number of "connections" but direct connections become completely impossible (at least for any video/voice calling system).
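
If you're curious what "doing direct connections" actually involves, here's a minimal Python sketch of UDP hole punching (the punch helper is hypothetical, and it assumes each peer has already learned the other's public ip:port from some rendezvous server - exactly the extra machinery NAT forces on everyone). It only works under EIM:

    import socket, time

    def punch(local_port, peer_ip, peer_port, tries=10):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("0.0.0.0", local_port))
        sock.settimeout(1.0)
        for _ in range(tries):
            # our outbound packet opens a mapping in our own NAT; under
            # EIM the peer's packets can come back through that mapping
            sock.sendto(b"punch", (peer_ip, peer_port))
            try:
                data, addr = sock.recvfrom(1024)
                print(f"hole punched: {data!r} from {addr}")
                return sock
            except socket.timeout:
                time.sleep(0.5)
        # under EDM/EDF the mapping is bound to the rendezvous server's
        # endpoint, so the peer's packets get dropped and we time out
        print("no luck - endpoint-dependent NAT?")
        return None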


Ironically, SQL Server on Linux AFAIK uses what basically amounts to a Microsoft reimplementation of Wine. Which always makes me wonder whether they'll ever get rid of Windows altogether someday in favour of Linux + a Win32 shim. I think there are still somewhat strong incentives to keep NT around nowadays, but I wouldn't be that surprised if this happened sometime down the line.


It's a Windows container. It runs the NT kernel and a minimal set of other things. The closest equivalent would be the Nano Server container.


AFAIK it's more like a reimplementation of NT APIs in userspace - aka basically Wine with extra steps, or Linux UM. There was a slide deck going around about Project Drawbridge, here: https://threedots.ovh/slides/Drawbridge.pdf


LPG isn't a good example of a "weird" fuel; there are countries, such as Italy, where it's immensely popular


This. The comparison between compilers and LLMs is utterly wrong, and yet I've heard it multiple times already in the span of a few weeks. The people suggesting it are probably unaware that Turing-complete languages follow mathematical properties, not just vibes. You can trust the output of your compiler because it was thoroughly tested to ensure it correctly translates one Turing-complete language (C, C++, whatever) into another (assembly), and computability theory guarantees that such a translation is always possible. LLMs are probabilistic machines, and it's grossly inappropriate to put them in the same category as compilers - it would be like saying that car tires and pizzas are similar because they're both round and have edges.
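
To make the category difference concrete, here's a toy Python sketch (neither a real compiler nor a real LLM, obviously): a compiler is a deterministic function of its input, while an LLM samples from a distribution, so repeated runs carry no equivalence guarantee:

    import random

    def toy_compile(src: str) -> str:
        # deterministic: same input, same output, every single time
        return src.upper()

    def toy_llm(prompt: str) -> str:
        # probabilistic: the output is sampled, so runs can differ
        return random.choice([prompt.upper(), prompt.lower(), prompt[::-1]])

    assert toy_compile("mov eax, 1") == toy_compile("mov eax, 1")  # always holds
    # no analogous assertion can be written for toy_llm("mov eax, 1")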


There's a big difference between running native ARM software on ARM and emulating x86 to run Windows. If this Mac were x86, it could probably have run Windows much faster thanks to virtualization


On Apple silicon, Parallels can't run x64 Windows; it uses the ARM version of Windows, and the x64 emulation is provided by Windows itself. Of course this is inefficient, but not everything is automatically 2x slower: any OS code you invoke is not running under x64 emulation, and I/O and memory access are not penalized by the emulation (though somewhat by virtualization). I was pleasantly surprised by how fast you can run x64 Windows apps.


Yeah, I wasn't aware that Microsoft allowed that nowadays. Still, it's not ideal, because in my experience Windows apps that are compatible with ARM are 90% either FOSS or portable to other platforms anyway. You use Windows to run x86 apps; if you don't need x86 apps you're generally better off not using Windows at all, and if you do need them they'll probably run poorly on ARM due to the multiple layers of emulation. Wine is still an option, though: it supports Rosetta on Mac and FEX/Box64 on Linux, so it may deliver better performance than Parallels

> I was pleasantly surprised how fast you can run x64 windows apps

In general, as long as you have a fast enough machine, emulation isn't that bad. Apple already did this when emulating 68k on PPC, and most people didn't notice because of how massively faster their first PPC computers were. Still, the issue is that here we're not exactly talking about a high-end CPU, are we?


It's a paramount imperative for Europe to wean itself off fossil fuels, regardless of the environmental arguments (which are still extremely relevant). Finding a safe, unfettered supplier of fossil fuels is becoming a basically unsolvable problem. China is trying to build as much solar and nuclear capacity as humanly possible; we should do the same. We've been having these energy shocks since the Yom Kippur war, basically; it's a vicious cycle of instability and crisis we can't leave behind. There's no shale to be found in Europe; we just have wind, sun and nuclear to save our backs. And maybe geothermal pretty soon?


I don't understand leadership's thinking. Surely spending €250B on a continental-scale renewable energy project would have a relatively short payoff time (at country scale) given the instability of relying on foreign energy sources. I mean, how long does oil have to sit above $100/barrel before it costs everyone that much anyway?
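
Back of the envelope, assuming the EU imports something on the order of 10 million barrels of oil a day (a rough figure; the exact number moves around year to year):

    barrels_per_day = 10_000_000          # rough order of magnitude
    usd_per_barrel = 100
    yearly_bill = barrels_per_day * usd_per_barrel * 365
    print(f"${yearly_bill / 1e9:.0f}B per year")  # ~$365B/year
    # a one-off €250B build-out is well under a year of oil imports
    # at $100/barrel, before even counting gas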


The recent "Offshore Wind Investment Pact" announcement was aiming for 1 Trillion euro of investment into North Sea wind by 2040.

Plus there's lots of other stuff happening. Also lots of pushback from those clinging to fossil fuels.


> spending €250B on a continental scale renewable energy project

Let me stop you there: the EU budget for 2026 was €193B. https://eur-lex.europa.eu/budget/www/index-en.htm

Basically, Europe doesn't have political leadership, nor does the EU itself have a budget that dwarfs the member states' the way the US federal budget does. On top of that the EU, primarily at Germany's insistence, has imposed "fiscal discipline" rules which prevent running a short-term large deficit to make this kind of capital investment.

Also, two hundred billion Euro is a lot of money for anyone who isn't an AI startup.


What is the value of the EU if it's not coordinating multi-national scale efforts?

This would need to be a joint venture, as some places are really good for wind and some are really good for solar, but not every country has access to those locations on its own. The EU's budget doesn't matter, because this project would be a separate line item with its own funding.

Energy independence is extremely valuable - way, way more valuable than $250B, or even $500B or $750B for that matter. Society runs on energy, and if it's not fully yours, you are always one rug pull away from social collapse.

If 2022 had been a cold winter and America had had a cold leader, this project would probably have breezed through the bureaucracy in a week.


> What is the value of the EU if it's not coordinating multi-national scale efforts?

Remember the EU is just a fancy self-updating free trade agreement, not a nation.

The coordination that the member states have thus far allowed the EU to take responsibility for is ~ "make all our rules equivalent so everyone's degrees are accepted everywhere, everyone's food is accepted everywhere, we all agree on what counts as a safe consumer product, limited range for tax shenanigans, etc."

(And for this, they get denounced as "complex" and "bureaucratic").

Actual direct investments do also exist (I just missed out on one for startups 20 years back, apparently due to a rules change), but they're peanuts compared to what member state governments do directly.


>> Energy independence is extremely valuable.

Not if your top politicians are on Putin's payroll, like Orban and Merkel.


The EU is not funding projects directly; it's setting the rules. Individual governments pay the bill.


Is this perhaps changing, with Macron indicating that the €300B in European savings that has been flowing to the US annually will be invested in Europe instead?

Macron says €300B in EU savings sent to the US every year will be invested in EU - https://news.ycombinator.com/item?id=46722594 - January 2026 (207 comments)

Europe can go fast when it wants to.

How Europe Ditched Russian Fossil Fuels With Spectacular Speed - https://www.bloomberg.com/news/features/2023-02-21/ukraine-n... | https://archive.today/yxGp2 - February 21st, 2023

> But what the past year has shown is that it’s possible to go harder and faster in deploying solar panels and batteries, reducing energy use, and permanently swapping out entrenched sources of fossil fuel.

> Solar installations across Europe increased by a record 40 gigawatts last year, up 35% compared with 2021, just shy of the most optimistic scenario from researchers at BloombergNEF. That jump was driven primarily by consumers who saw cheap solar panels as a way to cut their own energy bills. It essentially pushed the solar rollout ahead by a few years, hitting a level that will be sustained by EU policies.


This is not entirely correct. The EU actually does both — it sets regulatory frameworks and funds projects directly through several mechanisms.


It's a mix of decades of brainwashing, the fossil lobby having the bigger paychecks, unstable times, and leaders' fear of investing in something new and uncertain. In every industry/organization there's the old saying that nobody ever gets fired for supporting/using the established solutions. This is the same situation: there is more motivation for staying on the known paths, especially when there is strong propaganda against the new ones.


The problem is that 30-50% of voters would just look at that and say:

Why are you spending €250B on corporate subsidies instead of giving us €250B?!


I think the lesson of the UK winter fuel subsidy payment is that while it feels great in year one, it doesn't actually solve any of the problems, and then the voters get incredibly mad if you try to take it away again.


Everything of importance ever done in the EU was in response to a major crisis, in no small part because these exact voter emotions are dampened in such times.


This can be viewed as a feature, not a bug. The defining feature of a republic is stability: orderly and lawful transfers of power, not rocking the boat, deliberative processes. If the people are enjoying prosperity and peace, why make drastic changes? So yes, it's when the situation is extreme that big shifts in policy happen.


"Never let a good crisis go to waste."


> The problem is that 30-50% of voters would just look at that and say: Why are you spending €250B on corporate subsidies instead of giving us €250B?!

Why is it a "problem" for voters (aka the taxpayers) to ask their leaders to justify how their tax money is being spent? To me this feels like the basic transparency that keeps democracy in check.

To me the problem is when politicians can't or won't answer those questions, because then they're either grifting or incompetent.

It's not like we don't have a laundry list of mismanagement, cough corruption cough, of governments spending money on bullshit with nothing to show for it, while stuff like healthcare keeps being underfunded.

So yeah, if you spend my money, you better have an answer.


Wind, sun and geothermal we have, albeit the technology to harvest them seldom comes from Europe. But finding a safe, unfettered supplier of nuclear fuel risks being just another unsolvable problem.


[flagged]


You've been told multiple times that this is a lie by omission. Why do you persist with it?


Honest question: why would anyone use Vim and not Neovim nowadays? I switched, what, 12 years ago? And I've never had to look back. Just curious, to be honest. Especially since Neovim is full of new features, while the Vim9 scripting language kind of tanked


I'll field this one as someone who has used regular ol' Vim for ~18 years and never switched. Why switch if your tool is working fine? I use vim literally every day all day long and it does everything I need it to do. Switching has a cost and there's no reason to pay it if it's working fine.


I originally switched because neovim was more polished, had better plugins and Lua config files. I then never had a reason to go back


I think I first switched because neovim supported the XDG config location, I could have ~/.config/nvim/init.vim instead of ~/.vimrc.


I ended up switching for plugin support. Other than that, unless you want to use Lua for your config files, I don't see a reason to switch either.


Because I don't choose what tools are available on every server at work, and it's guaranteed that at the very least old-school vi is installed on every linux server, and often vim. Maintaining that muscle memory is useful.


I used to think this too, but I've been switching back and forth between neovim and vim for close to a decade now, and I've never noticed a problem. In fact I often don't even notice which one I'm using unless I explicitly check. Once you add neovim-only plugins that can change, of course, but if you can't choose what tools are available on the server then I'd imagine you're not installing plugins anyway.


One reason might be how off-putting the Neovim community is, hijacking Vim discussions to denigrate an all-time-great, beloved work of technology and its creator (who did decades of work for free, gave it to the world, and gave any money to actual orphans) all for Neovim users'/devs' own egos, promotion, and obsession. Almost all of Neovim was made by Moolenaar, from concept to execution, and I don't know that I've ever seen any gratitude.

I've never seen Vim users do that. If I had to choose, I'd use Vim.


Just want to say that although I don't use either Vim or Neovim, I feel grateful for what Vim has done. Vim keybindings can be used in a multitude of editors, and you can even bring the keybinding concept into browsers and other software.

It's truly revolutionary, when one thinks about it, how much impact Vim has had on terminal users.

(Neovim's plugin system is nice, but I agree with ya that some parts of the community don't appreciate Bram because of the Vim vs Neovim thing, from my observation.) It's best if, instead of treating it as Vim vs Neovim, we use the tools we prefer and appreciate the tools others use too, and the contribution of one to the other. Appreciating Vim doesn't make your appreciation for Neovim lesser; appreciating both can be great. Something that's hard in the editor space in general.

Rest in peace, Bram.


Can't say I really interact with the "community"; I installed the program and I use it a lot. I am grateful for the existence of vi and vim. I now use neovim where I can, and vim or vi as needed.


I use both gvim on Linux and MacVim on Mac for a lot of things - not 'real' coding, typically, but opening and editing scripts and config files, writing in markdown, etc.; I'm usually opening these from Dolphin or Finder. In the terminal, working on real code bases rather than scripts, I use neovim. My configs for these have diverged a bit over the years, but since the use cases are different it doesn't bother me.


I didn't switch because there was no reason to. And there is still none.


Have you ever called neovim inside a venv? It didn't work for me (or maybe I'm too lazy to jump through hoops when vim works out of the box).


muscle memory mainly, I guess?

Sure, switching might not be that troublesome, but I can tell you the first 48 hours or so will be painful, you'll insert stray ":" and "i" characters everywhere :)


I barely use vim these days, and I still do that in every text editor.


gvim?

