a) Scala, being a JVM language, is one of the fastest around. Much faster than, say, Python.

b) How large are the 1% of feeds, and how big are the total joined datasets? Because ultimately that is what you build platforms for, not the simple use cases.



1) Yes, Scala and the JVM are fast. If we could just use that to clean up a feed on a single box, that would be great. The problem is that calling the Spark API creates a lot of complexity for developers and for the runtime platform, which is super slow. 2) Yes, for the few feeds that are a TB we need Spark. The platform really just loads from Hadoop, transforms, then saves back again.


a) You can easily run Spark jobs on a single box. Just set executors = 1.

b) The reason centralised clusters exist is that you can't have dozens or hundreds of data engineers/scientists all copying company data onto their laptops, causing support headaches because they can't install X library, and making productionising impossible. There are bigger concerns than your personal productivity.


> a) You can easily run Spark jobs on a single box. Just set executors = 1.

Sure, but why would you do this? Just using pandas or DuckDB or even bash scripts makes your life much easier than having to deal with Spark.
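
If the feed fits on one box, the whole job can be a few lines. A minimal DuckDB sketch (the file and column names are made up):

    import duckdb

    # One-shot cleanup: read the raw CSV, normalise a column, write Parquet.
    duckdb.sql("""
        COPY (
            SELECT user_id, trim(lower(email)) AS email, event_ts
            FROM 'feed.csv'
            WHERE email IS NOT NULL
        ) TO 'feed_clean.parquet' (FORMAT PARQUET)
    """)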


For when you need more executors without rewriting your logic.
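
i.e. the same job runs on a laptop or a cluster with only the master setting changed. A rough PySpark sketch (hypothetical file names):

    from pyspark.sql import SparkSession

    # "local[*]" uses all cores on one box; on a cluster, drop .master()
    # and let spark-submit supply the master URL instead.
    spark = (SparkSession.builder
             .appName("feed-clean")
             .master("local[*]")
             .getOrCreate())

    df = spark.read.csv("feed.csv", header=True)
    df.filter(df.email.isNotNull()).write.parquet("feed_clean.parquet")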


Using a Python solution like Dask might actually be better: you can work with all of the Python data frameworks and tools, but you can also easily scale it if you need to, without having to step into the Spark world.
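
A rough sketch of the Dask version (hypothetical file and column names); the same code runs on one machine, or on a cluster if you point it at a distributed scheduler:

    import dask.dataframe as dd

    # Looks like pandas, but lazy and partitioned; .compute() triggers execution.
    df = dd.read_parquet("feed_clean.parquet")
    counts = df.groupby("user_id")["event_ts"].count().compute()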


But Dask is orders of magnitude slower than Spark.

And you can still use Python data frameworks with Spark, so I'm not sure what you're gaining.
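
For instance, Spark hands results straight to pandas, and since Spark 3.2 there is a pandas API on Spark. A quick sketch (hypothetical file name):

    from pyspark.sql import SparkSession
    import pyspark.pandas as ps

    spark = SparkSession.builder.getOrCreate()

    # Pull a Spark result into a plain pandas DataFrame for local tooling.
    pdf = spark.read.parquet("feed_clean.parquet").limit(100_000).toPandas()

    # Or skip the conversion and use the pandas API on Spark (Spark 3.2+).
    psdf = ps.read_parquet("feed_clean.parquet")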


Re: b. This is a place where standard remote dev environments are a boon. I'm not going to give each dev a terabyte of RAM, but a terabyte to share with a reservation mechanism, understanding that contention for the full resource is low? Yes, please.


But can you justify Scala existing at all in 2025? I think it pushed boundaries but ultimately failed as a language worth adopting anymore.


Absolutely.

a) It is one of the only languages that lets you write your entire app in one language: Scala compiles to JavaScript, the JVM, and native code via LLVM.

b) Its core type system (the DOT calculus underlying Scala 3) has a machine-checked soundness proof, something very few languages can claim.

c) It is the innovation language. Many of the concepts that are now standard in other languages had their implementations borrowed from Scala. And it is continuing to innovate with libraries like Gears (https://github.com/lampepfl/gears), which does async without function colouring, and compiler additions like capture checking for resource capabilities.


I'm sorry, but these are extremely weak arguments, and I would contend Scala caused more harm than good overall.


Scala is still one of the most powerful languages out there.


PySpark is a wrapper, so Scala is unnecessary and just bogs you down.


PySpark is great, except for UDF performance. That gap means Scala is still helpful for some Spark edge cases, like column-level encryption/decryption with UDFs.
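
To make the gap concrete: a plain Python UDF ferries values across the JVM/Python boundary row by row, while a vectorised pandas UDF (or a native Scala UDF) moves whole Arrow batches. A rough PySpark sketch (hypothetical column name, with masking standing in for real encryption):

    import pandas as pd
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import udf, pandas_udf
    from pyspark.sql.types import StringType

    spark = SparkSession.builder.getOrCreate()

    # Plain Python UDF: each value is serialised and deserialised individually.
    @udf(StringType())
    def mask_slow(v):
        return v[:2] + "***" if v else v

    # Vectorised pandas UDF: operates on whole Arrow batches, far less overhead.
    @pandas_udf(StringType())
    def mask_fast(col: pd.Series) -> pd.Series:
        return col.str.slice(0, 2) + "***"

    df = spark.read.parquet("feed_clean.parquet")
    df.select(mask_fast(df.email).alias("email_masked")).show()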



