Hacker News | Rounin's comments

Problems with scaling have been the biggest timewaster in my career:

1. In some large businesses I've worked in, so many people have been hired that some systems and processes have wound up being controlled by entirely different people than the people who need them. Coordinating with people who have little to no incentive to do what they're being asked, and waiting on them, takes up a large part of the working day.

2. In other businesses, a large fraction, or even a large majority, of the employees have had no discernible job except to talk and write about the work performed by the few people doing an actual job. So a lot of time in these businesses would be spent dodging meeting invitations, rejecting grand ideas about revolutionizing the business with AI on the blockchain, saying no to "if you could X, that'd be great", and generally reminding people that they're not in charge.

The great thing about these problems is that you're not very likely to have them in a small startup, but if you decide to grow the organization later, you'll need to be very vigilant about how you scale.


Hey, thanks for the reply! Those do seem like tricky problems, and specific to large enterprises.

Hard to pin down how those might be solved.

How have these affected your work?


Many projects have taken longer, been more stressful, and had worse outcomes than necessary. A lot of the work being done hasn't even been intended to deliver any business value, but to give one or more people an opportunity to be seen to be doing something. Actual value creation does occasionally take place as well, but more as a happy accident or a side effect than anything else. I'm very glad I'm not a major shareholder in any of these corporations.


There are a couple of subreddits related to this. Perhaps you will find them useful: https://old.reddit.com/r/BuyFromEU/ https://old.reddit.com/r/degoogle/


You should be able to make it think you have another card:

    export HSA_OVERRIDE_GFX_VERSION=10.3.0

The possible values are said to be:

    gfx1030 = "10.3.0"
    gfx900  = "9.0.0"
    gfx906  = "9.0.6"
    gfx908  = "9.0.8"
    gfx90a  = "9.0.a"


Telling ROCm to pretend that your RDNA 3 GPU (gfx1102) is an RDNA 2 GPU (gfx1030) is not going to work. The ISAs are not backwards-compatible like that. You might get away with pretending your gfx1102 GPU is a gfx1100 GPU, but even that depends on the code that you're loading not using any gfx1100-specific features. I would generally recommend against using this override at all for RDNA 3 as those ISAs are all slightly different.

In any case, the possible values can be found in the LLVM documentation [1]. I would recommend looking closely at the notes for the generic ISAs, as they highlight the differences between the ISAs (which is important when you're loading code built for one ISA onto a GPU that implements a different ISA).

[1]: https://llvm.org/docs/AMDGPUUsage.html#processors
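A cautious workflow, sketched below, is to first check which ISA the GPU actually reports before overriding anything (the `rocminfo` command ships with ROCm; the gfx1102-to-gfx1100 pairing is just the example from above, not a recommendation):

```shell
# Check which ISA the GPU actually reports (requires ROCm):
#     rocminfo | grep -o 'gfx[0-9a-f]*' | sort -u
#
# Only then, if the code you're loading was built for a close sibling ISA
# (e.g. gfx1100 code on a gfx1102 card), set the override -- it must be
# exported before the ROCm runtime initializes:
export HSA_OVERRIDE_GFX_VERSION=11.0.0
echo "$HSA_OVERRIDE_GFX_VERSION"
```

Note that the override only changes what the runtime believes; if the loaded code uses an instruction the real GPU lacks, it will still misbehave or crash.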


I forgot that there's an "11.0.0" as well. Perhaps others have been added since.


I believe the override for the GP's 7600 is 1100 (i.e. "11.0.0"), as gfx1030 is RDNA 2 (6800 XT).


The 7900 models are all 1100, the 7800XT is 1101 and the 7600 is 1102.

See Shader ISA: https://www.techpowerup.com/gpu-specs/radeon-rx-7600-xt.c419...


D. It's quite C-like, but more concise, and has a richer standard library, garbage collection, threading, etc.


Perhaps something like this to get just one commit with no large files:

    git clone --depth 1 --filter=blob:limit=100k <repository-url>
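The combination can be demonstrated end to end against a throwaway local repository (the repo names and commit messages here are made up for the demo; `uploadpack.allowfilter` is needed for partial clones over the file:// transport, and `--filter` requires git 2.19+):

```shell
# Build a tiny two-commit repo, then clone it with --depth 1 so that
# only the latest commit is fetched; --filter=blob:limit=100k would
# additionally skip any blob over 100 kB.
set -e
tmp=$(mktemp -d)
git init -q "$tmp/src"
git -C "$tmp/src" config uploadpack.allowfilter true
git -C "$tmp/src" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "first"
git -C "$tmp/src" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "second"
git clone -q --depth 1 --filter=blob:limit=100k "file://$tmp/src" "$tmp/dst"
git -C "$tmp/dst" rev-list --count HEAD   # prints 1: only one commit fetched
```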


For looking at the commit history rather than the files, apparently one can use git ls-remote.


Recompressing an already lossily compressed file is almost guaranteed to lose more information, whereas storage media keep getting cheaper over time. An 18 TB hard disk is now within the budget of many people, and prices are likely to fall further.

So if your purpose is to archive these files because they're worth keeping, buying a bigger disk may make even more sense.


I don't consider hard disks now, although I have tons of them. I keep multiple copies of those files, but it's a pain in the ass to distribute the same backup to different disks, simply because their read/write rates are too slow. After transferring the files, I run a validation program to make sure they're all right. This process takes me a week or so, and I have to do it regularly to ensure errors don't accumulate over time. So now I want SSDs, but the price per TB is still 4x that of HDDs.

Slight degradation in quality is not my concern, since ultimately I use realtime upscaling tools to watch them. But I don't know exactly how H.265 affects the quality of a video.

By making the files smaller, I can 1) distribute them to other disks faster, 2) validate correctness faster, and 3) set a higher redundancy rate, because now I have more free space.

But the problem is: will H.265 become obsolete before it becomes infrastructure? You know AV1 is a better algorithm, and companies are pushing it.

Or will H.265 become unavailable in the future due to, I don't know, royalty issues or something like that?


Retirement?


The "Open Location Code" is often mentioned on Hacker News, but is sadly neither open, nor a location code.

To pick one example, if you go to 0°06'40.6"S 28°56'27.0"E (-0.111271, 28.940829) in Google Maps, it'll give the Open Location Code "VWQR+F8W Maipi, Democratic Republic of the Congo", or some variation thereof, depending on your local language.

The most significant bytes, "Maipi, Democratic Republic of the Congo", are obviously not a location code, but a place name, and thus cannot be decoded at all.

Moreover, if you go to OpenStreetMap and look up "Maipi", it returns three places in Indonesia, and none in DR Congo. So even using a location service plus the algorithm could land you on the wrong continent.

The "Open Location Code" is essentially only usable as a search key for Google Maps. "Go look it up on Google" isn't a location code, it's advertising.


Binary search and similar forms of successive approximation. They can be used to solve such a wide array of problems given just a minimal amount of information.
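As a tiny illustration (the predicate and all the numbers are made up for the example): all you need is a monotonic yes/no question, and halving the range each step finds the boundary in logarithmic time.

```shell
# Binary search as successive approximation: find the smallest x in
# [lo, hi] for which a monotonic predicate holds -- here, "x*x >= target",
# i.e. an integer square root rounded up.
target=1000
lo=0
hi=100
while [ "$lo" -lt "$hi" ]; do
  mid=$(( (lo + hi) / 2 ))
  if [ $(( mid * mid )) -ge "$target" ]; then
    hi=$mid            # predicate holds: the answer is mid or earlier
  else
    lo=$(( mid + 1 ))  # predicate fails: the answer is after mid
  fi
done
echo "$lo"   # prints 32 (32*32 = 1024 >= 1000, while 31*31 = 961 < 1000)
```

The same shape works whenever the question is monotonic: guessing games, finding the first bad commit, tuning a rate limit, and so on.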


Can you please ELI5?


That sounds more like what a proprietary licence would be used for. You could license both the binaries and the source code under this proprietary licence and provide them to users.

