Hacker News — jappgar's comments

It would be useful to predict things like earthquakes and tornadoes. Gambling on what politicians and celebrities will do is not science; it's degenerate court gossip.

Game theory can be applied to these problems, and if you can correctly model rewards, costs, and motives, you can predict politicians' decisions more often than not.
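As a toy illustration of that kind of modeling (all names, payoffs, and probabilities here are invented, not taken from any real analysis), a decision reduces to comparing expected reward against cost:

```python
# Toy expected-utility model of a binary decision.
# reward: payoff if the action succeeds
# cost: price paid for acting, win or lose
# p_success: modeled probability the action succeeds

def predict_decision(reward: float, cost: float, p_success: float) -> str:
    """Predict 'act' if the expected payoff of acting beats doing nothing."""
    expected_payoff = p_success * reward - cost
    return "act" if expected_payoff > 0 else "hold"

# Big upside but low odds and a real cost: 0.3 * 10 - 4 = -1, so hold.
print(predict_decision(reward=10.0, cost=4.0, p_success=0.3))  # -> hold
```

The hard part, of course, is the premise: getting the rewards, costs, and probabilities right in the first place.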

Even granting that game theory can be applied successfully here, it doesn't really help with one-off events. Consider: knowing the odds of a coin flip doesn't give you any real help in knowing what the next flip will be.
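The coin-flip point can be shown in a few lines: knowing the long-run odds pins down the frequency over many trials, but tells you nothing extra about any single trial.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Simulate 10,000 fair coin flips (True = heads, p = 0.5).
flips = [random.random() < 0.5 for _ in range(10_000)]

# The long-run frequency converges toward the known odds...
frequency = sum(flips) / len(flips)
print(f"heads frequency: {frequency:.3f}")

# ...but knowing p = 0.5 still leaves every individual flip a 50/50 guess.
```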

This also ignores that the game theory of partisan games breaks down if any participant knows what the others will do. That's one of its more famous results.

To that end, if you want to predict what someone will do, more often than not you are best looking at their experience doing said thing.


> you can predict decisions of politicians more often than not

What makes you believe this? The performance of economics/sociology experts using game theory to make predictions has been worse than a coin flip up to this point. It has also done enormous damage.


I don't think Nash envisioned politicians and their aides being able to profit from making decisions. Do you launch an attack on Eastasia? Well the market right now says there's a 40% chance, so I guess it's a good idea to grab your crypto keys and make some bets before you call the Joint Chiefs.

As you can see by downvotes and comments, they still don't get it.

LLMs make developers more efficient. That much is obvious to anyone who isn't blinded by fear.

But people will respond "but you still need developers!" True. You don't need nearly as many, though. In fact, with an LLM in their hands, the poor performers are more of a liability than ever. They'll be let go first.

But even the "smart" developers will be subsumed, as vastly more efficient companies outcompete the ones where they work.

Companies with slop-tolerant architectures will take over every industry. They'll have humans working there. But not many.


> LLMs make developers more efficient.

They do not. I review a ton of code, and while the quantity is going up, the quality of that code is getting worse. LLMs only make developers more efficient if those developers skip the due diligence required to verify the output; they all say they don't skip it, and almost all of them do.


It's probably not a person you're replying to, so there's no point in trying to have a reasonable conversation.

Because a website is easier to use and more accessible...


This one is not very accessible, try using tab + arrow keys to focus anything on the sidebar.


If that is the case, why would you have TUIs at all?


Great question.

It's funny: when you're using a Claude Code terminal inside VS Code as a "TUI", you're actually using a web application.


Why did they make a website?


If you're not willing to give up your RSUs you shouldn't be surprised that the executives aren't either.

The moral failing is all of ours to share.


I was willing to (and did) give up my equity.


I doubt hobbyists would describe their hobby as purgatory.

I doubt the laborer would describe their toil as "craft".


> I doubt hobbyists would describe their hobby as purgatory.

Programmers have become accustomed to a lot of cultural and financial respect for their work. That's about to disappear. How do you think radio actors felt when they were displaced by movies? Or silent film actors when they were displaced by talkies?

> I doubt the laborer would describe their toil as "craft".

Intellectual labor is labor. I'm a laborer in programming and I definitely consider it a craft. I think a lot of people here at HN do.


And they were and are of course right to feel those feelings, but it doesn't change the fact that the world is changing. Rarely do large changes benefit everyone in the world.


> And they were and are of course right to feel those feelings, but it doesn't change the fact that the world is changing. Rarely do large changes benefit everyone in the world.

I'm not sure who you are arguing against. No one here said that the world isn't changing. But it seems to me that the people who are disadvantaged by AI, which is potentially everyone who doesn't own a data center, should take efforts to ensure their continued survival, instead of merely becoming serfs to the ruling oligarchs.


Only if your flow is writing the actual code.

If your flow state involves elaborating complementary specifications in parallel, it's marvelous.


Yet no one seriously declares motor vehicles as useless.


Many who live in sufficiently walkable areas don't have one and are actively opposed to getting one.


It's funny that people blame the site for this.

That toxicity is just part of software engineering culture. It's everywhere.


It's karma farming. The number must go up regardless of the human cost. That's why the same problem is seen here, to a lesser extent.

Karma in social media is a technology to produce competitiveness and unhappiness, usually to increase advertising engagement.

Compare how nice the people are on 4chan's /g/ board with the declining years of SO. Or Reddit, for that matter.


Real security systems don't publicize how they work.

This is just grandstanding. Half the people from this lab will go on to work for AI companies.


> Real security systems don't publicize how they work.

175 years of history would disagree with you: https://en.wikipedia.org/wiki/Security_through_obscurity


That old saw. Downvote all you want. Adversarial engineering does indeed rely on obscurity; they just don't tell you that.


I've been working in security for more than 20 years and have seen the deleterious effects of security through obscurity first-hand. Why does "adversarial engineering" rely on obscurity?

