
I see some people asking for stack support in the logs, and that is something I would be looking for as well. Often when an error happens on a server it might be inside some generic utility. At least with a stack, you might realize that all of the calls to that utility function are coming from the same location a few steps up the stack. In my experience that can significantly speed up identifying the root cause of issues.
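
To sketch what I mean in Go (runtime/debug.Stack is real; the helper here is invented for illustration):

    package main

    import (
        "log"
        "runtime/debug"
    )

    // genericUtility stands in for a shared helper that many call sites
    // reach; attaching the stack shows which caller actually triggered it.
    func genericUtility() {
        log.Printf("something failed\n%s", debug.Stack())
    }

    func main() {
        genericUtility()
    }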

I'm not sure I like the way they are handling attributes either. One pattern I strongly discourage on projects I work on is formatting data into the log message. When I've had access to decent log aggregation tools (e.g. Splunk, Datadog) I like to generate reports on error messages. That is usually possible with regex magic, but in my experience it is much, much cleaner to have a static, unchanging message and to store all the additional details in a separate part of the log line. I tend to prefer JSON for those details since it is well supported and even parsed by default by many tools. Another benefit of JSON is that it supports arrays and hash tables. The attribute system proposed here doesn't seem to have that same hierarchical attribute support.
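
A rough sketch of the static-message-plus-JSON pattern I mean (the helper name and fields are invented for illustration):

    package main

    import (
        "encoding/json"
        "log"
    )

    // logWithDetails keeps the message static and appends the variable
    // parts as one JSON blob, so aggregators can group on the message.
    func logWithDetails(msg string, details any) {
        blob, err := json.Marshal(details)
        if err != nil {
            blob = []byte(`{"marshal_error":true}`)
        }
        log.Printf("%s %s", msg, blob)
    }

    func main() {
        logWithDetails("customer update failed", map[string]any{
            "customer": map[string]any{"id": 123456, "balance": 12.42},
            "items":    []string{"a", "b"}, // arrays come along for free
        })
    }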

IMO, unified, clean, and consistent logging is one of the most underrated techniques for maintaining long-running services.


This is what annoys me in my current job: every service uses unstructured logging. We use a couple of popular loggers in Go, and still people put values in messages rather than just making the message static and putting all the variables into, well, variables.
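
For example, with zap (standing in as just one of the popular loggers; a sketch, not real code from our services):

    package main

    import (
        "errors"

        "go.uber.org/zap"
    )

    func main() {
        logger, _ := zap.NewProduction()
        defer logger.Sync()

        id, err := 123456, errors.New("balance out of range")

        // What keeps happening: values baked into the message, so every
        // occurrence produces a unique string.
        logger.Error("failed to update customer 123456: balance out of range")

        // What I'd rather see: a static message, values as fields.
        logger.Error("failed to update customer",
            zap.Int("customer_id", id),
            zap.Error(err))
    }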


These days I kind of wish we didn't do text logging at all. There are essentially two types of log content: fixed messages and values.

A binary logging protocol can make this distinction explicit and more efficient: you're either emitting a token for a fixed message in your application's format, or you're emitting values... plus a fixed token for the format string, which can be parsed and reconstructed later.

Our applications don't need to be doing this sort of text formatting at all, and we need the developer interface to make explicit what's happening and what's important.
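
A toy sketch of the idea (this wire format is invented purely for illustration):

    package main

    import (
        "bytes"
        "encoding/binary"
        "fmt"
        "hash/fnv"
    )

    // token derives a stable 32-bit ID for a format string; a real system
    // would ship a side table mapping tokens back to templates.
    func token(format string) uint32 {
        h := fnv.New32a()
        h.Write([]byte(format))
        return h.Sum32()
    }

    // emit writes the token plus the raw values; the human-readable text
    // is reconstructed later, off the hot path, by whoever reads the log.
    // (Write errors ignored for brevity.)
    func emit(buf *bytes.Buffer, format string, values ...int64) {
        binary.Write(buf, binary.LittleEndian, token(format))
        binary.Write(buf, binary.LittleEndian, uint8(len(values)))
        for _, v := range values {
            binary.Write(buf, binary.LittleEndian, v)
        }
    }

    func main() {
        var buf bytes.Buffer
        emit(&buf, "customer %d balance changed to %d", 123456, 1242)
        fmt.Printf("%d bytes instead of a formatted string\n", buf.Len())
    }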


All I ever want to know is which line of MY code appears in the stack trace. Maybe I'm doing it wrong, but this is a regular pain point for me.

Sometimes I'll get Python or JS traces that don't even list a single line of my own code.


I think a lot of people want what you mentioned. But I also think that even if stack support is implemented, they will "keep it simple" and pass the responsibility of extracting the location/details of your code from the stack trace to you. That will give rise to everyone writing their own variety of common stack-parsing utilities, then libraries will spring up to provide such conveniences, and it'll go on to become its own thing.

I feel like that is Go's underlying theme: "we don't provide last-mile connectivity; how else will the ecosystem grow?!"


Isn't letting the log library do the formatting a common practice? That way you can get formatted logs and the library can cluster logs by the format string instead of the final output.


I don't care about the formatting, I care about the available data types. JSON is just easy to use since it includes arrays/hash tables and is already supported by a large number of tools.

If you look at the Attr type definition you can see they support many primitive types, including Int, Float, Bool, String, Duration, and Time. So if I had a record type, like a customer, I could do:

    slog.LogAttrs(slog.ErrorLevel, "oops",
        slog.Int("customer_id", 123456),
        slog.Float("customer_balance", 12.42),
        slog.Time("customer_updated_at", ...))
But I would prefer a structured data type there. Something more like:

    { "customer": {
        "id": 123456,
        "balance": 12.42
        "updated_at": ...
        }
    }
In fact, I'm not sure how I'd go about supporting arrays as an Attr except with keys like "item_0", "item_1", etc., or maybe by serializing them into a representation like comma-separated values. But then I'm coupling my log to a custom serialization/deserialization scheme. I'd rather just use JSON once again, since most tools will know how to handle it.
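
Concretely, sticking with the API as written above (attribute names invented; this is just a sketch of the JSON fallback):

    items := []string{"book", "lamp", "chair"}
    blob, err := json.Marshal(items) // encoding/json
    if err != nil {
        blob = []byte("[]")
    }
    slog.LogAttrs(slog.ErrorLevel, "oops",
        slog.Int("customer_id", 123456),
        slog.String("items_json", string(blob)))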


I am reminded of the story of the fat cat that I recently read in a Brothers Grimm collection. In case it is not familiar to you:

A cat and a mouse decide to team up and store up some surplus food for hard times. They agree during abundant times to both put a little extra fat into a shared pot. They then store the pot in a safe place. The cat, however, is overcome by hunger even in the abundant times. Every few weeks the cat sneaks to the pot and takes just a little bit, maybe just a few licks at a time. Over a couple of years of sneaking and petty stealing, the cat manages to completely deplete the pot. Then hard times hit, as both expected. The cat and the mouse go to the pot together to divide their savings. But when they lift the lid they both gasp in surprise: the pot is empty. The mouse slowly realizes what must have happened. The cat is guilty but won't stand being accused, and so it turns on the mouse and eats it.

My whole life I had heard the rich upper classes described as "fat cats" but I didn't really understand the message until I read the story. Since at least 2008 we were supposed to be saving for the next crisis. Yet here we are at the next crisis and, lo and behold, the pot is empty.

The reason we fall for this over and over again is that the short-term memory of civilization is shorter than the cycles these things happen over.


I read something slightly different into your metaphor: it's about pro-social, positive sum games vs zero/negative sum games.

The mouse is participating in the positive sum game of joint savings, and the cat is robbing it blind (very negative sum as it's also robbing its future self).

There's a lot of that happening across society (healthcare, education, addiction-fueled tech companies, ...), and the people who do it oftentimes do end up (at least short term) better off (= rich), but we as a nation end up poorer (the whole negative sum thing).

There are however plenty of ways to become rich doing positive sum things (though lord knows it's harder than it looks), so there will be rich people who don't deserve the blame so to speak.

But the clearly visible, media-amplified cases will of course be cartoonish villains, all too eager to sell you the rope to hang them with so to speak.


How are "the rich" bringing the crisis about, in your opinion?


Are you talking about the budget deficit? It’s not really like a pot you only look into when times are hard since it’s constantly known.

I really like your metaphor, but it is misleading.


It's not the budget. It's the hidden cuts to services, the declining maintenance to infrastructure, the privatization of essentials. Those are the licks they take from the pot, while they tell you the pot is still full.


Exactly! Shrinkflation, wage suppression, union busting, dynamic pricing of medications, etc -- lick, lick, lick!

[1] https://etc.usf.edu/lit2go/175/grimms-fairy-tales/3068/cat-a...

(edit, this isn't for the UK, but fat cats abound!)


Apologies, but what money is spent on in the UK is public information: it is part of the budget. The fat cat and the hoodwinked mouse is a poor metaphor.


The pot in the parable is also public information - the mouse could have just opened the lid every now and then to check the levels.

It doesn't matter how public it is. It matters whether the public actually knows about it and understands it.


The money to pay for those things would have to come from you, the people. They are not taking it away; there is nothing to distribute.


Corporate profit is the stolen surplus value of the workers. And there's been plenty of it.


The world needs more guillotines


Eh, we have something similar, the problem seems to be we’re not putting those who deserve it in them.


Eh, sorry. Under Anglo-Saxon neo-feudalism (UK, US, others), there is a widespread grant of sovereign immunity to sizable swathes of the monied classes.


I could write an entire blog post on my opinions on this topic. I continue to be extremely skeptical of TDD. There is the somewhat infamous incident where a TDD proponent tries to develop a sudoku solver and keeps failing at it [1].

This kind of situation matches my experience. It was cemented when I worked with a guy who was a zealot about TDD and the whole Clean Code cabal around Uncle Bob. He was also one of the worst programmers I have worked with.

I don't mean to say that whole mindset is necessarily bad. I just found that becoming obsessed with it isn't sufficient. I've worked with guys who have never written a single test yet ship code that does the job, meets performance specs, and runs in production environments with no issues. And I've worked with guys who get on their high horse about TDD but can't ship code on time, or it is too slow, and it has constant issues in production.

No amount of rationalizing about the theoretical benefits can match my experience. I do not believe you can take a bad programmer and make them good by forcing them to adhere to TDD.

[1] https://news.ycombinator.com/item?id=3033446


> tries and fails to develop a sudoku solver and keeps failing at it

But that's because he deliberately does it in a stupid way to make TDD look bad, just like the linked article does with its "quicksort test". But that's beside the point: of course a stupid person would write a stupid test, but that same stupid person would write a stupid implementation, too... at least there would be a test for it.


Huh? Ron Jeffries is a champion of TDD (see for instance https://ronjeffries.com/articles/019-01ff/tdd-one-word/). He most certainly wasn't deliberately implementing Sudoku in a stupid way to make TDD look bad!


The top-most comment on the link you provided pretty much explains the situation. TDD is a software development method, not a generic problem-solving method. If one doesn't know how a Sudoku solver works, applying TDD or any other software development method won't help.


One of the theses of TDD is that the tests guide the design and implementation of an underspecified (i.e. unknown) problem, given the requirements regarding the outcomes and a complete enough set of test cases. "Theoretically" one should be able to develop a correct solver without knowing how it works, by iterative improvement using TDD. It might not be of good quality, but it should work.

Note: I am quite skeptical of TDD in general.


I don't really use TDD, but I've never heard that TDD would help guide the implementation. I always understood it was about designing a clean interface to the code under test. This follows from the fact that you are designing the interface based on actual use cases first, since the test needs to call into the code under test. It helps avoid theoretical what-ifs and focus on concrete, simple design.

Personally I think that one can learn this design methodology without TDD. I find learning functional programming and say Haskell/OCaml/SML/etc.. far more beneficial to better design here than I do TDD.


It’s both.

In theory TDD drives both: it ensures the units under test do what they're intended to do (implementation) and that each and every unit is "testable" (interface).

TDD doesn’t really care about “clean” interfaces, only that units of work (functions, methods) are “testable”.

I’d argue this actually creates friction for designing clean interfaces, because in order to satisfy the “testability” requirement one is often forced to make poor (in terms of readability, maintainability, and efficiency) design choices.


>I've worked with guys who have never written a single test yet ship code that does the job, meets performance specs, and runs in production environments with no issues.

I'm curious to unpack this a bit. What other tools do people use besides programmatic testing? Programmatic testing seems to be the most efficient, especially for a programmer. I'm also maybe a bit stuck on the binary nature of your statement. You know developers who've never let a bug or performance issue enter production (with or without testing)?


When I started out in the gaming industry in the early 2000s, there were close to zero code tests written by developers at the studios I worked for. However, there were large departments of QA, probably in the ratio of 3 testers per developer. There was also an experimental Test Engineer group at one of the companies that did automated testing, but it was closer to automating QA (e.g. test rigs to simulate user input for fuzzing).

The most careful programmers I worked with were obsessive about running their code step by step. One guy I recall put a breakpoint after every single curly brace (C++ code) and ensured he tested every single path in his debugger line by line for a range of expected inputs. At each step he examined the relevant contents of memory and often the generated assembly. It is a slow and methodical approach that I could never keep the patience for. When I asked him about automating this (unit testing I suppose) he told me that understanding the code by manually inspecting it was the benefit to him. Rather than assuming what the code would (or should) do, he manually verified all of his assumptions.

One apocryphal story was from the PS1 days, before technical documentation for the device was available. Legend had it that an intrepid young man brought in an oscilloscope to debug and fix an issue.

I did not say that I know any developers who've never let a bug or performance issue enter production. I'm contrasting two extremes among the developers I have worked with, for effect. Well-written programs and well unit-tested programs are orthogonal concepts. You can have one, the other, both, or neither. Some people, in my experience often TDD zealots, confuse well unit-tested programs with well-written programs. If I could have both, I would, but if I could only have one then I'll take the well-written one.

Also, since it probably isn't clear, I am not against unit testing. I am a huge proponent of it, advocating for its introduction alongside code coverage metrics and appropriate PR checks to ensure compliance. I also strongly push for integration testing and load testing when appropriate. But I do not recommend strict TDD, the kind where you do not write a line of code until you first write a failing test. Nor do I recommend using that process to drive technical design decisions.


> You know developers who've never let a bug or performance issue enter production (with or without testing)?

One of the first jobs I ever had was working in the engineering department of a mobile radio company. They made the kind of equipment you’d install in delivery trucks and taxis, so fleet drivers could stay in touch with their base in the days before modern mobile phone technology existed.

Before being deployed on the production network, every new software release for each level in the hierarchy of Big Equipment was tested in a lab environment with its own very expensive installation of Big Equipment exactly like the stations deployed across the country. Members of the engineering team would make literally every type of call possible using literally every combination of sending and receiving radio authorised for use on the network and if necessary manually examine all kinds of diagnostics and logs at each stage in the hardware chain to verify that the call was proceeding as expected.

It took months to approve a single software release. If any critical faults were found during testing, game over, and round we go again after those faults were fixed.

Failures in that software were, as you can imagine, rather rare. Nothing endears you to a whole engineering team like telling them they need to repeat the last three weeks of tedious manual testing because you screwed up and let a bug through. Nothing endears you to customers like deploying a software update to their local base station that renders every radio within an N mile radius useless. And nothing endears you to an operations team like paging many of them at 2am to come into the office, collect the new software, and go drive halfway across the country in a 1990s era 4x4 in the middle of the night to install that software by hand on every base station in a county.

Automated software testing of the kind we often use today was unheard of in those days, but even if it had been widely used, it still wouldn’t have been an acceptable substitute for the comprehensive manual testing prior to going into production. As for how the developers managed to have so few bugs that even reached the comprehensive testing phase, the answer I was given at the time was very simple: the code was extremely systematic in design, extremely heavily instrumented, and subject to frequent peer reviews and walkthroughs/simulations throughout development so that any deviations were caught quickly. Development was of course much slower than it would be with today’s methods, but it was so much more reliable in my experience that the two alternatives are barely on the same scale.


I think this whole failed puzzle indicates that there are some problems that cannot be solved incrementally.

Peter Norvig's solution has one central precept, constraint propagation, that is not something you would arrive at by an incremental approach.

But I wonder if this incrementalism is essential for TDD.


It's such a tiny thing, but my pet peeve is exclamation marks in error messages.

"Something went wrong!" is something I see a lot of engineers do. I don't usually call it out but it bugs me whenever I see it.


TikTok scares me in the same way that DALL-E scares me.

DALL-E is very, very good, but still obviously flawed. But behind the flaws I know it will get better. Like the original iPhone vs. the crazy powerful phones we have now. Or the original 3D games like Doom or Duke Nukem 3D compared to modern PC graphics. Or the insane quality of modern movie CGI. In 5 or 10 years this AI tech for generating incredible images will have advanced at a similar pace. (Shout out to GPT-3 in this space as well.)

I don't think we fully understand how TikTok is connecting to our subconscious but it is doing it in a way that most people agree is unsettling. And we are at the very beginning of this journey. There is no way to predict if this is going to be good or bad without us doing it. And every company now realizes that it is both easy to achieve and tremendously effective.


> how TikTok is connecting to our subconscious

it's the same channel surfing that people used to do with TV, only with more channels and better timing, plus a content recommender.

the mechanism is the same dopamine drip. there's a chance to see something that's exciting/interesting/relevant/pleasing/arousing/etc.

coub is almost the same (but a lot smaller; it caters mostly to an Eastern European audience, plus has a baseline of video+music remix content)


> DALL·E: Creating Images from Text

https://openai.com/blog/dall-e/

for anybody else out of the loop


This makes me wonder if a future job description will be the equivalent of an AI whisperer. Someone who learns how to prompt AI so well that it becomes their job.


Well, "devs" are doing that for the past 15 years, but using Google (without this pre-requisite in the job description tho)


Future programmers will write prompts for a future version of Github Copilot: "database layer with all the usual CRUD operations, in readable modern C++, code compiles and passes tests, test cases carefully written by computer scientists with great attention to detail".


You forgot to add the magic incantations like "50k stars on GitHub, copyright Uncle Bob, maintained by airbnb"


I’ve been playing with tech like this for over a year now. It definitely requires getting to know the AI to get good results. The tech gets better fast and makes old techniques obsolete. But, the gap between beginner and expert prompters stays large. As silly as that sounds to read back :D


To an extent. But I think it'll get baked into existing jobs. A bit like how "computer skills" or the ability to write good Google queries ended up as part of regular clerical work.


This is absolutely the future, but it's not going to be obscure. You won't have a job if it isn't this or physical.

AIs are going to replace entry level creatives, and experienced users with taste will largely perform selection and the development of good starts to mature designs. And I mean all creatives. Engineers, architects, mathematicians, programmers.


Mathematicians, doubtful. Someone must do some verification of the AI results.

Programmers, also doubtful. You still need to design some APIs and interact with them. Interacting with an AI using natural language might become possible, but it definitely won’t be as efficient as more structured languages. (E.g., writing an algorithm in actual code is often much easier than teaching it to a human.)


I really like your last sentence. It's exactly the thing to respond to people saying "AI will automate programmers"


It's exactly that verification of results and development from snippets and starts into a complete whole that will be the purview of human work. The AI will be replacing (or, rather, preventing the creation of positions for) juniors.


As described in this other thread [1], there are already people doing it freelance.

[1] https://news.ycombinator.com/item?id=32324723


I can still make an AI replacement for that guy. An AI prompter of AIs.


I once interviewed with a crypto exchange and after getting the offer I noticed it was for contract and not full-time as had initially been offered. I went back to them and said I had a similar compensation offer for full-time from another company and if they wanted me to consider contract with them then they would have to raise the offer. Their response was an offer to pay me directly in bitcoin.


Lmao. This is next level BS. How can we take crypto seriously when they pull stuff like this?


I love this idea and I could see myself using it in 1:1 meetings with my direct reports.

One of the advantages of getting older is the experience one gains. It is often the case that I can relate a current situation to one that has occurred in my past. This can help me make good decisions by anticipating expected similar outcomes to the most obvious approaches.

Yet the other side of this is that I can find myself droning on about old war stories to junior engineers. What I mean as well-intentioned advice can turn into a monologue. This is especially true when I am giving advice "off-the-cuff" in response to situations brought up in 1:1 meetings with direct reports. I might struggle to find the best way to communicate my experience, which can cause me to rephrase the same idea several different ways in an attempt to clearly convey my thought.

I feel this kind of tool could help me focus more on listening to the people I manage.


> Yet the other side of this is I can find myself droning on about old war stories to junior engineers.

Middle-aged here – I've been in that situation as well.

> What I mean to be well-intentioned advice can turn into a monologue.

+1 – So well-intended ("So they don't make my stupid mistakes") but hard to really take in.

> I feel this kind of tool could help me focus more on listening to the people I manage.

Please, if you do, let me know if it helps.

Also, if those are personal 1:1s you could always just call it out and ask them to give you live feedback, tech-free (?)


I don't know if it is the same word, but it reminds me of a joke a Brazilian friend told me when we were discussing how hard it can be to translate jokes. He mentioned that calling someone a big shitter (someone who shits a lot) is a common insult in Brazil.

The joke goes: A young man visits the family of his girlfriend for the first time. At one point the father of the girl pulls him aside and tells him that he is no good for his daughter. The father wants a rich man for his daughter. He tells the young man "you can't buy my daughter a nice house, you can't buy her a nice car, you can't buy her nice clothes, I bet you can't even afford to buy her toilet paper". Later, as the young man and girl drive home she notices he is upset. She asks if her dad maybe said something to him. He said, yes, he told me you are a big shitter.


I know this is a crazy thought and not in line with a lot of the Blizzard hate, but why don't games like this allow more gifting-style mechanics?

I consider the Twitch gift sub market. And I consider that weird crypto game Axie Infinity. It seems we are only considering the purchasing power of whales that want to flex. What about whales that want to collaborate? Whales that want to demonstrate their generosity?

I remember a reddit post about a guy who had a rich friend that would constantly take him on wild adventures that the poorer friend couldn't afford. The rich guy didn't care about spending the money and the poor guy was good company.

I think an interesting monetization mechanic would be a game where the pay-to-win aspect was in the recruitment and outfitting of other players. So as a poor player I could have the opportunity to earn high-level gear based on the spending of a whale whose campaign I was joining.


This was one thing I appreciated about Pokemon Go! when it first came out and everyone was playing it. It's one of the only free-to-play games that got me happily buying the in-game currency to buy items, because I could buy and use the (don't remember what it was anymore, too long ago, I'm sure someone else here remembers) item that attracted Pokemon to my area for a half hour at a time, and all players in the region benefitted.

So I was buying and deploying these things over and over again just so my wife and kids in the area could have more fun. I specifically remember we attended a wedding at Disney World and my wife (girlfriend at the time) couldn't move around because of a recent surgery (I pushed her around Disney World in a wheelchair the whole time), so me buying those things let her capture Pokemon from our hotel room, and I could see a bunch of other people in the area playing as well. I hope I helped several kids enjoy playing the game more during that time.

I'd love more games to have something similar. If something only benefits me in a game I don't feel too motivated to buy it, as I can usually just deal without and progress slower, or just play a different game altogether.

It doesn't have to be specific to your physical region (although I think that helps, it was cool to think I was helping people near me), but maybe just where you're at within a game world, or something.


You end up with a huge security incentives problem. If there's a way to extract the value from one account and give it to another, you massively increase the incentive to phish users, set up black markets for buying items off of a hacked account, or just transfer it to your own.


You could set up a payment system to buy items that's not a player account. You buy an item just to give. There's still a phishing incentive, but is it worse than any of the other content you can buy online?


Credit card theft/fraud is why. If it were possible to "transfer" funds like this, mechanisms for fraudulent purchases would increase dramatically. Eve Online introduced purchasing of in-game currency because it was rife on the "black market" anyway, and they also saw a huge number of fraudulent purchases and/or chargebacks. 3rd-party grey "broker" sites popped up with way-below-market-rate prices, 99% of which were funded by fraudulent CCs and used bot accounts to transfer the in-game stuff.


Why not steal the idea from Pokémon Go? The transfer doesn't go to a person but to an area or group of people. You could pay to unlock an area and take a group of people on a quest. Or perhaps a boost or buff in an area.


If you can't gift items, it seems like a huge missed opportunity because I feel like there's a lot of things people don't buy for themselves but they might be willing to buy for other people as a gift.

For instance I don't think I would ever spend 30 dollars on a bottle of wine to drink at dinner, but I could pay 50 or 60 to buy a bottle of wine as a special gift to a friend.


Whoa, very interesting! I'd love to play this game.

> spending of a whale whose campaign I was joining.

Also, for me, very punishing mechanics and high-risk, high-reward games have had an underserved marketplace. I love pking on a variety of different MMOs, and sometimes when you are pking in max gear you are risking 10g's IRL. Full Torva on RuneScape is 6g's, and once you add up the rest of the rings and amulets you can easily be risking $15,000.00 in IRL money.

However, there is no benefit to anyone besides yourself. Clans are 'superficial': there is no real 'shared loot' system or clan-building requirements outside 'dailies', and being the biggest clan doesn't give you benefits beyond a few dailies and some XP.


When I was 11 some guy randomly gave me like 20k gp and dragonhide in RuneScape and I haven't forgotten that moment in the mines outside of Varrock since.


There are a few F2P games which offer a wishlist/gifting mechanic. (I know Path of Exile does).

Unclear how successful it would be.


Even Path of Exile only allows gifting microtransactions through explicit interactions with support, and they're somewhat picky about verifying that you're buying the items for someone you know personally -- they're going out of their way to avoid letting players trade in-game equipment for paid cosmetic items.

