
Amen.

A close family member has spent the past decade going to sometimes extreme measures to avoid an abusive ex-spouse.

Google has potentially revealed their contact information pre-emptively, and not through any choice or inaction of their own. (On zero notice, expecting users to spend perhaps hours wandering through a plenitude of scattered and poorly documented Buzz settings and behaviors -- please!) No: any email contact who uses Gmail is now a potential point of exposure for them.

ABSOLUTELY UNACCEPTABLE.

Stupidity can aid and abet evil, Google. If you are not purposefully evil, you are aiding and abetting it.

My trust is gone. It's not coming back.

If you want any measure of damage control, you will determine who was actually responsible for this. And everyone in a position of responsibility for this product who did not understand or chose not to heed these concerns. And you will terminate them. Anything less, and we can do no more than expect similar bad decisions -- from those same people -- in the future.



"And you will terminate them. Anything less, and we can do no more than expect similar bad decisions -- from those same people -- in the future."

I'm not looking to defend Google's choices about Buzz. But I take exception to the idea of firing people who make bad decisions without due consideration for how those decisions came about.

It flies in the face of a mantra at HN: Fail early and often. It goes against the idea that you learn by making mistakes.

If a company fires people for making a poor decision out of ignorance, then they have just lost someone with valuable experience about a potentially troublesome choice. Now that company has to go get someone who (most likely) has not fucked up in that way. (I bet most places do not hire people who got fired from another job for fucking up.)

Who is more likely to make that same or a similar mistake in the future? The person who already fucked up and learned something, or the new person without that experience?

Some people screw up because they are innately incompetent in some field. Let them go; they will not get any better.

Others screw up because they are doing something new, or acting with incomplete or wrong information. In that case, the problem may not be the person but the situation.

Fix the conditions, don't just find a scapegoat.


"It flies in the face of a mantra at HN: Fail early and often. It goes against the idea that you learn by making mistakes."

When you're a young, tiny startup, you can fail early and often because the costs of doing so are outweighed by the benefit of the education you get, both to yourself and to society at large.

This is Google. It's huge, it's been around the block a few times, it's already had a ton of failures, and millions of people rely on it. It can't and shouldn't have the same latitude to fail, especially in an area as important as privacy.


"This is Google. It's huge, it's been around the block a few times, it's already had a ton of failures, and millions of people rely on it. It can't and shouldn't have the same latitude to fail, especially in an area as important as privacy."

All the more reason to consider what is the best course to prevent similar mistakes in the future. Knee-jerk firings may make things worse.


"Don't worry boss, the next time I have to make a call about whether or not I'll require millions of users to opt out of sharing some of their most private personal information with the entire world, I'll know just what to do."

Some lessons don't need to be learned from experience, and some screw ups are bad enough that they should result in real consequences.


This situation is not simply a matter of inconvenience. It's a matter of users' safety including physical safety.

As an example of some thoughts on the latter:

http://news.ycombinator.com/item?id=1119173

And it is not a terribly challenging intellectual exercise to realize some of the implications of the product rollout as configured and executed.

Finally, having used and observed Google products for some time now, I perceive this to be yet another in an increasing cascade of decisions and behaviors that have given short shrift to legitimate and often apparent security considerations. Yes, that is my opinion. So is my parent comment. Take from it what you will.

First, in my further opinion, there are "mistakes" that are simply too significant to just forgive and learn from. They demonstrate an inability of the parties responsible to carry out their duties and responsibilities. I see Buzz as such a case.

Second, Google as an institution has let various aspects that are exposed by these situations -- from effective design and execution, particularly with respect to some aspects of security, to effective customer relations -- slide for too long. If they are going to improve, it's become apparent that as an institution they are going to have to take some dramatic action. Something that communicates to their employees that the status quo will no longer do.

It's my opinion. And maybe I'm overly pissed off at the moment. But this Buzz rollout is particularly boneheaded, and Google is better off without the employees who had responsibility for preventing such a fiasco.

Google held on to many customers by making enough of a show of concern for their privacy. It's what kept me using their search results, despite the increasing tracking they've been implementing. They are now at risk of losing this perception, both with the public and with their customer base.

When your customers don't trust you, your other efforts at engagement run a distant second.


This is really well-considered.



