
I interpreted the clause “two poor alternatives in a row” as Biden + Harris in the 2024 presidential election, and not Clinton + Harris, since Clinton was the 2016 nominee and Harris was the 2024 nominee after Biden dropped out, but the 2020 nominee was Biden, who did successfully defeat Trump that year.

In my opinion, Clinton’s and Harris’ losses had less to do with their gender and more to do with the candidates themselves:

1. Clinton was facing strong anti-establishment headwinds, and Clinton is a very establishment politician. Many people in 2016 were hopping mad at establishment politicians. Trump was able to win the GOP nomination on a platform of “draining the swamp” and pursuing an aggressively right-wing agenda compared to more moderate Republicans, and Sanders, who also ran on an anti-establishment platform, proved to be a formidable opponent to Clinton. Despite Clinton’s loss, she still won the popular vote. Perhaps had there been less anti-establishment sentiment, it would have been a Clinton vs. Jeb Bush election, and I believe Clinton would have won that race.

2. Harris never won a presidential primary. The only reason she ended up becoming the nominee is that Biden dropped out of the race after his disastrous debate performance against Trump, which occurred after the primaries. Since it was too late to have the voters decide on a replacement for Biden, the Democratic Party selected one: Harris. She had only a few months to campaign, whereas Trump had been campaigning for virtually his entire time out of office.

3. Let’s not forget the Trump factor in 2024. During Biden’s entire presidency, Trump was able to consolidate his hold on the GOP and his voting base, and in some ways he even expanded his base. The conservative media was filled with defenses of January 6, and Trump was able to convince enough Americans that he and his supporters were persecuted in the aftermath of the 2020 election and January 6.


I believe Trump would have won 2020 had the COVID pandemic not happened. Things were very chaotic in 2020 America. Biden and his extensive experience in the federal government looked reassuring to a lot of Americans. Biden would have had a tougher time against Trump had 2020 been more like 2019. I believe Biden would have had a tougher time against Bernie Sanders in the primaries had COVID not happened, though a counterargument is that Super Tuesday happened on March 3, before shelter-in-place policies were in effect in California.

A big reason for Trump's success despite his polarizing nature is the polarizing effects of the platforms of our two parties, which distinguish themselves on "culture war" issues such as abortion, gun rights, immigration, LGBT+ rights, and race relations. There are many Americans who love the MAGA agenda, and there are also many Americans who are not in 100% agreement with MAGA but who'd never vote for a Democrat since they feel that a candidate with the opposite cultural views is anathema. If third parties were more viable in America, the latter group of voters could vote for a candidate that is more to their temperament instead of voting for whomever the GOP nominee is.


Had COVID not happened, Trump might not have gone batshit crazy with a vendetta against the entire concept of independent federal agencies. Actively rejecting the advice coming from Fauci et al would seem to be a large part of what sensitized him to the larger pattern rather than just writing each instance off as an interpersonal issue.

(by "Trump" and "him" I mean the person himself plus his symbiotic ecosystem of enablers and followers)


As much as I love alluring designs such as the NeXT Cube (which I have), the Power Mac G4 Cube (which I wish I had), and the 2013 Mac Pro (which I also have), sometimes a person needs a big, hulking box of computational power with room for internal expansion, and from the first Quadra tower in the early 1990s until the 2012 Mac Pro was discontinued, and again from 2019 until today, Apple delivered this.

Even so, the ARM Mac Pro felt more like a halo car than a workhorse. It might have been more compelling had it supported GPUs. Without that support, the price premium of the Mac Pro over the Mac Studio was too great for many people to justify, unless they absolutely needed internal expansion.

I’d love a user-upgradable Mac like my 2013 Mac Pro, but it’s clear that Apple has long moved on with its ARM Macs. I’ve moved on to the PC ecosystem. On one hand ARM Macs are quite powerful and energy-efficient, but on the other hand they’re very expensive for non-base RAM and storage configurations, though with today’s crazy prices for DDR5 RAM and NVMe SSDs, Apple’s prices for upgrades don’t look that bad by comparison.


> sometimes a person needs a big, hulking box of computational power with room for internal expansion

Between cloud computing and server racks, is this still a real niche?


Yes. A prominent example: people who don’t know how to build and run servers but do a lot of video editing and want high disk speeds and processing power.

I believe this is the first time since 1987 with the introduction of the Macintosh II that there are no Macs in Apple's lineup that offer some type of combination of upgradeable RAM, upgradeable storage, and internal expansion slots. The 2013 Mac Pro lacked internal expansion slots, but still had DIMM slots and an SSD slot. The 2019 Mac Pro brought back expansion slots, though the 2023 Mac Pro took away DIMM slots in favor of the unified memory architecture found in all ARM Macs.

I have mixed feelings about this. On one hand I miss being able to upgrade RAM at a later date without having to pay up-front for all of the RAM I'm expected to use for the lifetime of the machine. This is especially painful in 2026 with today's sky-high RAM prices caused by intense demand. On the other hand, the memory bandwidth in Apple's ARM Macs is tremendous, especially in higher-end Macs, due to the tight integration of the design. This matters greatly in memory-intensive applications such as generative AI. I feel less bad about non-expandable RAM given the tradeoffs, though it still makes for quite expensive computing, especially at 2026 RAM prices.

I guess Apple has finally achieved Steve Jobs' original Macintosh vision of closed-off appliances, though (thankfully) the NeXT Cube and the NeXTstation were not like that. RIP to Jean-Louis Gassée's vision of expandable, upgradeable Macs, starting with the Macintosh II in 1987 and leading to other fine Macs such as the Macintosh IIfx, the Quadra lineup, high-end Power Macs (8100, 8500, 9500, 8600, 9600, G3, G4, G5), and the Mac Pro.


Indeed. It seems, at least in America (I’m less familiar with the situation abroad) that computer science researchers who want to do longer-term work are getting squeezed. Less funding means fewer research positions in academia. Industry has many opportunities, especially in AI, but industry tends to favor shorter-term, product-focused research as opposed to longer-term work with fewer immediate prospects for productization. This is a great environment for many researchers, but researchers who want to work on longer-term, “blue-skies” projects might not find a suitable position in industry these days.

I wholeheartedly agree. Computing professions such as software engineering used to feel like, "Wow, they're paying me to do this!" Yes, there was real work involved, but for many of us it never felt like drudgery, and we produced, shipped, and made our customers, managers, and other stakeholders happy. I remember a time (roughly 20 years ago) when zealous enthusiasts would proudly profess that they'd work for companies like Apple or Google for free if they could work on their dream projects.

Times have changed. The field has become much more serious about making money; fantasies about volunteering at Apple have been replaced with fantasies about very large salaries and RSU grants. Simultaneously (and I don't think coincidentally), the field has become less fun. I recognize how privileged this sounds, talking about "fun," given that for most of humanity work isn't about having fun and finding personal fulfillment, but about making the money required to house, feed, and clothe themselves and their loved ones. Even with the drudgery of corporate life, it beats the working conditions and abuse that workers in many other occupations endure.

Still, let's pour one out for a time when the interests and passions of computing enthusiasts did line up with the interests of the corporate world.


The money was what did it, not the AI. If we were all just tinkering with the AI all day long this stuff would still be fun.

Money sucking the joy out of things, a tale as old as time.


My take is that there used to be a significant overlap between hobbyist-style exploration/coding and what industry wanted, especially during the PC revolution where companies like Apple and Microsoft were started by hobbyists selling their creations to other people. This continued through the 1990s and the 2000s; we know the story of how Mark Zuckerberg started Facebook from his Harvard dorm room. I am a 90s kid who was inspired by the stories of Steve Jobs and Bill Gates to pursue a computing career. I was also inspired by Bell Labs and Xerox PARC researchers.

The “hacker-friendliness” of software industry employment has been eroding in the past decade or so, and generative AI is another factor that strengthens the position of business owners and managers. Perhaps this is the maturing of the software development field. Back when computers were new and when there were few people skilled in computing, employment was more favorable for hobbyists. Over time the frontiers of computing have been settled, which reduced the need for explorers, and thus explorers have been sidelined in favor of different types of workers. LLMs are another step; while I’m not sure that LLMs could do academic research in computer science, they are already capable of doing software engineering tasks that undergraduates and interns could do.

I think what some of us are mourning is the closing of a frontier, of our figurative pastures being turned into suburban subdivisions. It’s bigger than generative AI; it’s a field that is less dependent on hobbyists for its future.

There will always be other frontiers, and even in computing there are still interesting areas of research and areas where hobbyists can contribute. But I think much of the software industry has moved in a direction where its ethos is different from the ethos of enthusiasts.


I’ve come to the same conclusion, though my line of work was research rather than software engineering. “He who pays the piper calls the tune.” It was fun as long as I enjoyed the tunes being called, but the tunes changed, and I became less interested in playing.

I am now a tenure-track community college professor. I’m evaluated entirely on my teaching and service. While teaching a full course load is intense, and while my salary is nowhere near what a FAANG engineer makes, I get three months of summer break and one month of winter break every year to rejuvenate and to work on personal projects, with nobody telling me what research projects to work on, how frequently to publish, or how fast to ship code.

This quote from J. J. Thomson resonates with me, and it’s more than 100 years old:

"Granting the importance of this pioneering research, how can it best be promoted? The method of direct endowment will not work, for if you pay a man a salary for doing research, he and you will want to have something to point to at the end of the year to show that the money has not been wasted. In promising work of the highest class, however, results do not come in this regular fashion, in fact years may pass without any tangible results being obtained, and the position of the paid worker would be very embarrassing and he would naturally take to work on a lower, or at any rate a different plane where he could be sure of getting year by year tangible results which would justify his salary. The position is this: You want this kind of research, but, if you pay a man to do it, it will drive him to research of a different kind. The only thing to do is to pay him for doing something else and give him enough leisure to do research for the love of it." (from https://archive.org/details/b29932208/page/198/mode/2up).


That was the original strategy for universities: teaching was the job, and research was the side product of having some very smart people with free time. Then some "genius" decided that it was better to have professors compete for money to pay directly for their research. This transformed a noble and desirable profession into just another money-chasing activity.

I remember when I first learned about GNUstep in 2004 when I was in high school. It's a shame GNUstep never took off; we could have had an ecosystem of applications that could run on both macOS and Linux using native GUIs.

With that said, the dream is not dead. There's a project named Gershwin (https://github.com/gershwin-desktop/gershwin-desktop), which is a Mac-like desktop environment built on top of GNUstep. Gershwin appears to be heavily inspired by Apple Rhapsody (https://en.wikipedia.org/wiki/Rhapsody_(operating_system)) with some modern touches.


I'd like to give my perspective as a computer science professor at Ohlone College, which is a two-year community college located in Silicon Valley. I used to work as an AI researcher in industry (but not in large language models) before becoming a tenure-track instructor in Fall 2024.

Our core computer science curriculum consists of five courses: (1) an introductory programming course taught in a procedural subset of C++, (2) an object-oriented programming course taught in C++, (3) a data structures and algorithms course taught in C++, (4) a discrete mathematics course, and (5) an assembly language course that also covers basic computer architecture. Students who pass all five courses are prepared to transfer to a four-year university to complete their undergraduate computer science programs. The majority of our students transfer to either San Jose State University or California State University East Bay, though many of our students transfer to University of California campuses, typically UC Davis, UC Santa Cruz, UC Merced, and UC Irvine.

Because I teach introductory freshman- and sophomore-level courses, I feel it is vital for students to have a strong foundation with basic programming and basic computer science before using generative AI tools, and thus I do not accept programming assignments that were completed using generative AI tools. I admit that I'd have a different, more nuanced stance if I were teaching upper-division or graduate-level computer science courses. I have found that students who rely on generative AI for programming tend to struggle more on exams, and they also tend to lack an understanding of the programming language constructs the generated program used.

With that said, I recognize that generative AI tools are likely to become more powerful and cheaper over time. As much as I don't like this brave new world where students can cheat with even less friction, we professors need to stay on top of things, and so I will be spending the entire month of June (one-third of my summer break) getting up to speed with large language models, both from a user's point of view and from an AI research point of view.

Whenever my students are wondering whether it's worth studying computer science in light of the current job market and anxieties about AI replacing programmers, I tell them two things. The first thing I tell them is that computers and computation are very interesting things to study in their own right. Even if AI dramatically reduces software engineering jobs, there will still be a need for people to understand how computers and computation work.

The second thing I tell them is that economic conditions are not always permanent. I was a freshman at Cal Poly San Luis Obispo in 2005, when computer science enrollment bottomed out in the United States. In high school, well-meaning counselors and teachers warned me about the post-dot com bust job market and about outsourcing to India and other countries. I was an avid Slashdot reader, and the piece of advice I kept reading was to forego studying computer science and earn a business degree. However, I was a nerd who loved computers, who started programming at nine years old. I even wrote an essay in high school saying that I'd move to India if that's where all of the jobs are going to end up. The only other things I could imagine majoring in at the time were mathematics and linguistics, and neither major was known for excellent job prospects. Thus, I decided to major in computer science.

A funny thing happened while I was at Cal Poly. Web 2.0, smartphones, cloud computing, and big data took off during my undergraduate years. My classmates and I were able to get internships at prestigious companies, even during the economic crisis of 2008-09. Upon graduation, I ended up doing an internship in Japan at a major Japanese tech company and then started a PhD program at UC Santa Cruz, but many of my classmates ended up at companies like Microsoft, Apple, and Google, just in time for the tech industry to enter an extended gold rush from roughly 2012, when Facebook went public, until 2022, when interest rates started to go up. Many of my classmates made out like bandits financially. Me? I made different choices, going down a research/academic path; I still live in an apartment and have no stock to my name. I have no regrets, except maybe for not getting into Bitcoin in 2011 when I first heard about it.... Though I'm not "Silicon Valley successful," I'm living a much better life today than I was in high school, when I qualified for Pell Grants and subsidized student loans to help pay for my Cal Poly education due to my parents' low income.

I still believe in the beauty of an undergraduate curriculum that encourages critical thinking and developing problem-solving skills, as opposed to merely learning industry topics du jour. Specific tools often come and go; my 2005 Linux system administration knowledge didn't cover systemd and Wayland since they didn't exist at the time, but my copies of Introduction to Algorithms by Cormen et al. and my Knuth volumes remain relevant.

