It's easy to armchair quarterback these things, and in retrospect, the actions that innocent people should take are probably obvious. At the time I don't think it would be so easy.
There is lots of pressure not to take action, because of the feeling you're overreacting, because you've had things explained to you in a way that minimizes or removes the criminality, and because your job is at stake.
And crucially, these are rarely black-and-white issues. If your employer told you to murder someone, it would be easy to say no and know you did the right thing. If they tell you to incrementally go along with some grey-area thing whose legal status you're not sure of, it's way harder to know what to do.
People still have to be accountable for their actions of course, ignorance is no excuse. But we all should hope we're never in such a situation to begin with rather than thinking we'll know how and when to act.
The article makes it sound clear sure. But then the article has been edited.
I would not have been surprised if the 5 million user thing was couched as some sort of "we need to generate some realistic test data to load test our systems <WINK WINK> - please create 5 million accounts very similar to these paying ones, remember this is testing so they need to be as realistic and believable as possible <WINK WINK>".
If a request like that (perhaps without the winking!) came down the line through the usual channels, I'd probably have gone along with it without realising it was for anything nefarious. ...but then would that be a viable defense?!
I think this skips over an important fact from the article: the head of growth and the CEO were in the room making this request, the eng director raised concerns, and they assuaged his concerns by saying it was ok for "investor purposes".
I can see the situation you're describing, sorta. Though if it was me and someone asked me to generate a list of 5 million real-ish user accounts in a report, I'd immediately ask why. If it's to commit fraud or lie to investors, I would be like hell no! If we're doing load testing or something legit, for sure. But I feel like benign use-cases of generating 5 million accounts would not include the "make it look real" aspect.
I also don't think the Reddit comparison makes sense, since Reddit didn't seek to sell the company at the time based on the # of users. Growth hacking is one thing, lying to investors about users is another. Because this data point was a key decision factor for a financial transaction, this fake information/lie becomes fraud.
Even if somebody gave no pretext, I don't think that, in and of itself, is illegal. Though it could be used for illegal things. For instance early on Reddit actively created fake accounts, fake votes, fake comments, and all other sorts of stuff in the process of trying to reach critical mass. I really doubt that was illegal.
OTOH if somebody sent a message saying, 'Hey we need to increase our apparent paying users in order to defraud some potential investors.' then obviously you've become part of a criminal conspiracy, but I think nobody would ever overtly say that.
I think there is a big difference between faking 10k users and then going to investors years later at 1m real users (a morally dubious kickstart) and, as in this case, inflating user numbers by 1400% for the sake of the sale/investment.
I can only imagine someone with a family to feed who is tied to corporate health insurance, or on an H-1B visa, being coerced into gray-area activities and unwilling to lose their job in order to stay ethical or legal.
One of many reasons employers have a quiverful of ways to exploit and control workers.
If you're serious about anything, you do more than hope. You do diligence on your prospective employer before going to work for them. You think through a litany of contingencies and prepare a plan of action for each. Jobs in this industry are uniquely amenable to this by virtue of their relatively higher compensation and the autonomy often afforded to employees. If you spend an hour every day on HN, you can spend an hour meditating upon your conscience.
Predicting one's response to stressful and unexpected circumstances is hard. So try to anticipate circumstances and cultivate relevant virtues in advance.