Hacker News | all2's comments

If you're in the market, OpenCode is quite good and has become my daily driver. You may also consider pi[0], but that's (from what I've heard) more agenty.

[0] https://shittycodingagent.ai/


You gotta make money somehow. Maybe have an optional durable+accessible storage and portal (just a SaaS and an optional hard drive that you ship out or update on occasion... a mini PC that pulls from the SaaS using rsync automatically?).

You might be able to make this work if you sell enough SaaS subscriptions (12 bucks a month or 200 a year for perennial backups -- ship us the device/etc. and we'll get the media into your account). You'd need 1000 customers with a 20% systems cost to do this full time, which seems reasonable.
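The back-of-envelope behind those numbers (my reading of the comment's figures, with "systems cost" taken as 20% of revenue):

```python
# 1000 subscribers at $12/month, minus a 20% systems-cost share.
subscribers = 1000
monthly_price = 12
systems_cost_share = 0.20
net_monthly = subscribers * monthly_price * (1 - systems_cost_share)
print(f"${net_monthly:,.0f}/month net")  # $9,600/month
```

Roughly $9.6k/month net, which is in the ballpark of a full-time salary.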


Physically recoiling at this CEO-scum response to OP's comment.

If you read the first line as “make a handsome profit”, I get it, but if you read it slightly more charitably to mean “this service [permanent backup] costs real money to operate, so you need a way to fund that somehow”, it seems perfectly reasonable to me.

Servers, storage, power, networking, and cooling aren’t free; therefore neither is reliable indefinite storage of family memories in digital form.


It is a way to make money. Provide service in exchange for money. I'm not sure what's wrong with that.

If he could figure out how to do it without ever spending money, that would be amazing and I would fully support it. As it stands, I saw what he was asking, did some math to sort out how he could manage it full time, and made a recommendation.

People are tired of SaaS, I get it, so I suppose you could ship an app to do something similar; wire it to talk to every possible imaging/recording device and then automate the 'download all pictures from this device'. But it still takes time. And potentially money.


I'm surprised it took a whole hour for someone to turn a charity into a SaaS

If we treat LLM output as a manufacturing output: if you have three 80% probabilities, you actually have something like 0.8 × 0.8 × 0.8 -> 0.512, or about 51%.

Yes, there's a wide variety of use cases that require different ratios of accuracy/speed. If you require 3 responses to be accurate, you have to multiply all 3 response accuracy probabilities, and as you've shown, this can reduce overall accuracy quite a bit. Of course, this does make the assumption that those 3 responses are independent of one another.
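A minimal sketch of that compounding, and, for contrast, what a simple majority vote over the same three responses buys you (assuming independent responses with the same per-response accuracy, as noted above):

```python
from math import comb

def chain_accuracy(p: float, n: int) -> float:
    """All n independent responses must be correct."""
    return p ** n

def majority_accuracy(p: float, n: int) -> float:
    """A majority of n independent responses are correct."""
    k = n // 2 + 1
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(chain_accuracy(0.8, 3))     # ~0.512: requiring all 3 hurts
print(majority_accuracy(0.8, 3))  # ~0.896: voting over 3 helps
```

Same 0.8 per-response accuracy, opposite effect: chaining multiplies errors, voting averages them out.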

One thing I considered some months ago that was very similar to what you guys have done, but at a higher abstraction layer:

1. Consult many models (or a single model with higher temp) with the same prompt

2. Intelligently chunk the outputs (by entity, concept, subject, etc.)

3. Put each chunk into a semantic bucket (similar chunks live in the same bucket)

4. Select winning buckets for each chunk.

4a. Optionally push the undervoted chunks back into the model contexts for followup: is this a good idea, does it fit with what you recommended, etc.

4b. do the whole chunk/vote thing again

5. Fuse outputs. Mention outliers.

Token spend is heavy here, where we rely on LLMs to make decisions instead of the underlying math you guys went with. IMO, the solution y'all have reached is far more elegant than my idea.
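A rough sketch of steps 1-5 above, with `difflib` string similarity standing in for real semantic bucketing (you'd use embeddings), and hypothetical plain strings standing in for model outputs:

```python
# Hypothetical sketch of the chunk/bucket/vote pipeline described above.
from difflib import SequenceMatcher

def similar(a: str, b: str, threshold: float = 0.7) -> bool:
    """Crude stand-in for 'these chunks belong in the same semantic bucket'."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def bucket(chunks):
    """Step 3: greedy bucketing -- each chunk joins the first similar bucket."""
    buckets = []
    for chunk in chunks:
        for b in buckets:
            if similar(chunk, b[0]):
                b.append(chunk)
                break
        else:
            buckets.append([chunk])
    return buckets

def vote(outputs, min_votes=2):
    """Steps 2-5: chunk each output (here: by sentence), bucket, pick winners."""
    chunks = [c.strip() for out in outputs for c in out.split(".") if c.strip()]
    bs = bucket(chunks)
    winners = [b[0] for b in bs if len(b) >= min_votes]
    outliers = [b[0] for b in bs if len(b) < min_votes]  # step 4a candidates
    return winners, outliers

# Three "model responses" to the same prompt (step 1).
answers = [
    "Use a queue. Retry on failure",
    "Use a queue. Log every request",
    "Use a queue. Retry on failure",
]
winners, outliers = vote(answers)
print("consensus:", winners)
print("undervoted (push back for follow-up):", outliers)
```

The undervoted bucket is what step 4a would feed back into the model contexts for a second round.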


I like the direction you're going with this strategy. There are many approaches, nuances, edge cases, and clever tricks to each of these steps, even without taking into account token probability distributions. Very powerful to get it right.

I've been watching the drizzle of LLM papers come through, and I think we're going to hit a 1T param MoE on consumer hardware before this year is out. It'll still be behind the bigco models, but it'll be a force multiplier. Ideally, we'd get these models to run on a CPU. MS BitNet is one way to do this. You can already run ternary LLMs on consumer CPUs with a decent tps.
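Back-of-envelope on why ternary weights matter for that goal (1e12 parameters; the ~1.58 bits/weight figure is BitNet's, the rest is plain arithmetic and ignores activations, KV cache, and overhead):

```python
# Weight memory for a 1T-parameter model at different precisions.
params = 1e12
fp16_gb    = params * 2 / 1e9           # 2 bytes per parameter
int4_gb    = params * 0.5 / 1e9         # 4-bit quantization
ternary_gb = params * 1.58 / 8 / 1e9    # BitNet-style ~1.58 bits per parameter

print(f"fp16: {fp16_gb:.0f} GB, int4: {int4_gb:.0f} GB, ternary: {ternary_gb:.1f} GB")
```

MoE helps compute per token (only active experts run), but the full weights still have to live somewhere, which is why the bits-per-parameter number dominates the consumer-hardware question.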

Though what is consumer hardware right now?

Can we still classify 5090s as consumer hardware given how expensive they are? They're £3k at the moment, and it looks like it's only going to get worse unless the AI bubble pops.


I got an Olares One system with a 24GB (consumer, not 32GB) NVIDIA RTX 5090 for less than $3k at the Kickstarter price. It comes with Olares OS, which for my purposes is not all that useful; I finally settled on a good Ubuntu 24.04 LTS configuration. But it was a good deal. I actually bought two.

I was thinking more in terms of 24GB of VRAM total. I started sketching the architecture for such a model this afternoon, nothing novel, just combining existing advancements in the field. It looks achievable.

I mean you can run a 1T model on consumer hardware now by doing things like layer offloading and streaming from SSD. It's just too slow to be useful.

We should also note that wind turbines require huge amounts of petroleum derivatives to operate.

Yeah, but at least the byproducts produce a solid that can last for years, vs. treating it as a consumable.

I'm fully expecting someone will reply to me and say that making plastic wastes 75% of the oil or something during production, and that it's just as wasteful amortized across the lifespan of a wind turbine. I'm tired, man.


You can compare material intensity of different electricity generation technologies.

https://davidturver.substack.com/p/material-intensity-electr...

According to the International Energy Agency, mineral demand for clean energy technologies would rise by at least four times by 2040 to meet climate goals, with particularly high growth for EV-related minerals.

https://www.iea.org/reports/the-role-of-critical-minerals-in...


You can recycle the minerals, so it will also fall back down to almost 0 on a longer timescale.

If you keep burning gas you will never stop mining.


We have heard many claims from politicians about https://en.wikipedia.org/wiki/Circular_economy

You can recycle minerals, and you should, but almost no recycling technology can recover 100% of the minerals, and recycling always has costs attached to it (for example, capital costs for building recycling facilities, operating costs in the form of labor for separation, and energy costs for melting and purifying material).

For example, aluminum is recycled not because we have a shortage of aluminium ore (Earth's mantle is 2.38% aluminium by mass), but because recycling is less energy intensive than production of fresh aluminum. https://international-aluminium.org/work-areas/recycling/

Recycling of EV batteries will lose between 1-10% of the valuable metals https://blog.ucs.org/jessica-dunn/how-are-ev-batteries-actua...

The worst kind of recycling is decreasing the costs of recycling by outsourcing it to third-world countries, exploiting lax environmental regulations or corrupt environmental-protection officials.

https://www.nytimes.com/interactive/2025/11/18/world/africa/...

https://en.wikipedia.org/wiki/Chittagong_Ship_Breaking_Yard

https://www.npr.org/sections/goats-and-soda/2024/10/05/g-s1-...


> aluminum is recycled... but because recycling is less energy intensive than production of fresh aluminum

So what?

> Recycling of EV batteries will lose between 1-10% of the valuable metals

How much gasoline, coal, and natural gas can you recycle?

> The worst kind of recycling is decreasing the costs of recycling by outsourcing to third world countries

That's going to happen as long as those countries are poor. They need to develop their economies quickly to demand better laws. Climate change will be a danger for many of them in the coming years.

Better, less-polluting recycling tech will help them far more than continuing to burn fossil fuels.


I just wanted to show that there is no such thing as perfect recycling technology.

If you want to choose the least material-intensive source of energy, you choose nuclear. By choosing nuclear energy you get the benefit of almost fully decarbonizing your electricity production, as can be seen in France.


Nuclear isn't perfect either. You can be embargoed for uranium way more easily, if you don't already have it. It's more expensive to build than solar and takes much longer (and don't BS me with "it's because of the regulations!" - everything, even solar, has regulations that drive up the cost and construction timelines).

If you can build price-competitive nuclear energy without government backstops or insurance, you have my blessing.

I personally think nuclear's time is in the far future when we have more advanced, exotic materials that make it radically safer and cheaper. For applications where solar isn't sufficient, such as space propulsion.


No energy technology is perfect; each has its benefits and drawbacks.

Yes, a nuclear power plant is more expensive than a solar power plant. But an electric grid based on renewables, once we add the costs of storage, backup generators, and the power-line upgrades needed to smooth out regional variations in production, is more expensive (or it can be cheaper if you have access to cheap natural gas, like the Texas power grid).

https://doi.org/10.1080/14786451.2024.2355642

https://www.olivierdeschenes.org/uploads/1/3/6/6/136668153/j...

Uranium is plentiful: more plentiful than antimony, tin, cadmium, mercury, or silver, and about as abundant as arsenic or molybdenum. https://en.wikipedia.org/wiki/Uranium#Occurrence https://en.wikipedia.org/wiki/Uranium_reserves Uranium is also very, very energy dense: current nuclear fuel can provide up to 45 gigawatt-days per metric ton of uranium, so stockpiling a few years' worth of fuel is not a problem. https://www.nrc.gov/reading-rm/doc-collections/fact-sheets/b...
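Taking the 45 GWd/t figure at face value, the stockpiling arithmetic is straightforward (this ignores thermal-to-electric conversion; if the 45 GWd/t is thermal burnup, electric output needs roughly 3x the fuel):

```python
# Tonnes of nuclear fuel per GW-year of continuous output at the
# quoted burnup of 45 gigawatt-days per metric ton.
burnup_gwd_per_tonne = 45
gw_days_per_year = 365
tonnes_per_gw_year = gw_days_per_year / burnup_gwd_per_tonne
print(f"{tonnes_per_gw_year:.1f} tonnes of fuel per GW-year")
```

On the order of ten tonnes per gigawatt-year, which is why multi-year fuel stockpiles are physically easy.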

Governments are always supporting new technologies https://bidenwhitehouse.archives.gov/cleanenergy/tax-guidanc... https://en.wikipedia.org/wiki/German_Renewable_Energy_Source... https://emagazine.com/clean-energy-subsidies-explained-how-t...

Nuclear energy is also quite safe https://ourworldindata.org/grapher/death-rates-from-energy-p...


> But an electric grid based on renewables, if we add the costs for storage, backup generator, power lines upgrades needed for smoothing out regional variations of production, is more expensive

Show your work. I'm telling you right now you're wrong. https://news.ycombinator.com/item?id=45446112

Even the Texas power grid makes heavy use of wind and solar.

> So stockpiling a few years' worth of fuel is not a problem

Weird that you were so concerned about "China dependence for PV" but this you just wave away. Stockpiling a few decades of PV and batteries is also not a problem.

"Rare earths" (not really used in panels) are plentiful too. Refining them is polluting and low-margin so developed countries prefer not to deal with them. Btw uranium is the same.

> Nuclear energy also quite safe

I didn't say anything about it being unsafe. But making it that safe currently costs a lot of money in materials, labor, and regulations.

Honestly it feels like you decided beforehand "nuclear is the way" and are trying to make every fact fit that. Or you're a troll/paid off by Big Oil. Sorry.


Texas Net Electricity Generation by Source, 2025

https://en.wikipedia.org/wiki/Energy_in_Texas

Natural gas: 40.5%
Coal: 12.7%
Other fossil: 1.01%
Nuclear: 8.47%
Wind: 23.2%
Solar: 13.7%
Other renewables (hydro, biomass): 0.18%
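Grouping the shares quoted above:

```python
# Grouping Texas's 2025 net generation shares by category.
fossil = 40.5 + 12.7 + 1.01        # gas + coal + other fossil
renewable = 23.2 + 13.7 + 0.18     # wind + solar + hydro/biomass
nuclear = 8.47
print(f"fossil: {fossil:.1f}%  renewable: {renewable:.1f}%  nuclear: {nuclear}%")
```

Roughly 54% fossil to 37% renewable, consistent with both "heavy use of wind and solar" and "gas fills the gaps."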

Texas has access to cheap natural gas, which is used when renewables don't deliver. In Texas, price is king.

> Big Oil. Sorry.

I would really prefer a high worldwide carbon tax. Worldwide, because cheap Chinese polysilicon production for cheap PV is an excellent example of carbon leakage.

https://www.sciencedirect.com/science/article/abs/pii/S09596...

https://en.wikipedia.org/wiki/Carbon_leakage

https://www.vox.com/energy-and-environment/2017/4/18/1533104...


As opposed to gas or coal turbines which are naturally lubed somehow?

That's a pretty excellent take, IMO. Just an undirected AI model doesn't do much, especially when the core team has time with the code, domain expertise, _and_ Claude.

There are multiple 'Folding@home'-style efforts for AI models at this point. I get the impression that we will see a frontier model released this year built on a system like this.

Because models are quickly moving toward commoditization, whether the big three like it or not. The differentiator now is the tooling around those models. By eliminating OpenCode's auth support, they prevent leaking customers onto another platform that allows model choice (they would likely lose paying customers to one of the major inference catalogs like OpenRouter once those customers moved from Claude Code to OpenCode).


So... IIRC Korean words are constructed out of symbols; would it be possible to mutate the meaning of keywords by giving the symbols meaning and constructing new blocks of symbols?


That's a fascinating idea! Hangul is indeed compositional — 한 = ㅎ + ㅏ + ㄴ — so in theory you could assign meaning to individual jamo components.
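That compositionality is visible in Unicode itself: NFD normalization splits a precomposed syllable into its conjoining jamo (a quick illustration, not part of Han):

```python
# Decompose a precomposed Hangul syllable into its jamo via Unicode NFD.
import unicodedata

def jamo(syllable: str) -> list:
    return list(unicodedata.normalize("NFD", syllable))

parts = jamo("한")
print(parts)                                  # conjoining jamo for ㅎ, ㅏ, ㄴ
print([unicodedata.name(p) for p in parts])
```

Note the result is the conjoining jamo (U+1100 block), which render differently from the standalone compatibility jamo ㅎ, ㅏ, ㄴ, though they represent the same letters.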

But in practice, breaking syllables into jamo would make keywords less readable, which goes against Hangul's design goal. And considering how AI-assisted coding works today, fully named descriptive keywords actually reduce errors — LLMs perform better with explicit, unambiguous tokens than with cryptic symbol compositions.

So Han leans toward more descriptive Korean keywords rather than shorter symbolic ones. Readability over brevity.

Interesting direction to think about though — thanks for the question.


Dice the mayo and sticks of RAM and place in a cast iron skillet over medium heat. Turn it every two or three minutes. Remove when you can smell the magic smoke.

