Hacker News | vitriol83's comments

This seems to be the way: make a genuinely great technical improvement that has nothing to do with AI, then keep the executives happy by tenuously linking it to AI usage.

In my field, which involves large legacy C++ codebases and complex numerical algorithms implemented by PhDs, LLMs have their place, but the productivity gains are not that great: current LLMs simply make too many mistakes in this context, and mistakes are usually very costly.

Everyone 'in the know' appreciates this but, in the current environment, equally has to play along with the AI hype machine.

It is depressing, but the true value of the current wave of LLMs for coding will become clearer over time. I think it will take serious architectural advances, rather than simply scaling what we have now, to make coding assistants reliable.


Are there any tools to convert large LaTeX documents to Typst? It looks like a huge improvement, but the migration path is the only thing stopping me.


Pandoc [1] can convert LaTeX to Typst, e.g. `pandoc input.tex -o output.typ` (it infers both formats from the file extensions).

[1]: https://pandoc.org/


In places where there is not much time for code refactoring, the following is helpful:

Imagine an idealised future state of the codebase, which everyone buys into, and make sure any new feature is going in that direction.

Refactoring existing code can be death by a thousand cuts; building a parallel new codebase that is incrementally adopted can be more efficient and quicker to market.


This is fairly obscure, but the problem he highlights can be overcome easily by localising at the saturation of S_f: D(f) = D(g) if and only if the saturations are equal, and localising at the saturation is an isomorphism.
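To spell that out (notation is mine, not the slide's), a short sketch for a commutative ring A:

```latex
% For $f \in A$, the saturation of the multiplicative set
% $\{1, f, f^2, \dots\}$ is
\[
  S_f \;=\; \{\, g \in A \;:\; D(f) \subseteq D(g) \,\}
    \;=\; \{\, g \in A \;:\; g \mid f^n \text{ for some } n \,\}.
\]
% Then
\[
  D(f) = D(g)
  \;\Longleftrightarrow\; \sqrt{(f)} = \sqrt{(g)}
  \;\Longleftrightarrow\; S_f = S_g,
\]
% and the canonical map $A_f \to S_f^{-1}A$ is an isomorphism, so
% defining $\mathcal{O}(D(f)) := S_f^{-1}A$ depends only on the open
% set $D(f)$, not on the chosen $f$.
```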


It seems though that we would want the computer to be able to do this kind of reasoning itself, and not rely on humans "pre-resolving" all such problems.


I understood the presentation to be on the distinction between equality and isomorphism and when an isomorphism "is an equality". From the slide on homotopy type theory I get the impression that he finds it unsatisfactory to simply consider all isomorphisms as equalities.

But I might have misunderstood your objection?


So many times it's necessary to 'identify' two or more objects which are isomorphic, and the word 'canonical' is supposed to justify why this doesn't cause a problem.

The reason it is necessary here is that the mapping D(f) -> A_f is a priori multi-valued, and we need it to be single-valued.

However, it's fairly easy to make a definition which is trivially single-valued, so I don't think it's a very instructive example of the phenomenon.

Probably more pertinent is an n-fold tensor product with different bracket orderings.
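For concreteness, this is the standard way that identification is justified (for modules over a commutative ring, say):

```latex
% The associator is the canonical isomorphism
\[
  \alpha_{U,V,W} : (U \otimes V) \otimes W
    \;\xrightarrow{\;\sim\;}\; U \otimes (V \otimes W),
  \qquad (u \otimes v) \otimes w \;\mapsto\; u \otimes (v \otimes w).
\]
% Mac Lane's coherence theorem says every diagram built out of
% $\alpha$ and the unit isomorphisms commutes, so all bracketings of
% $V_1 \otimes \cdots \otimes V_n$ can be identified unambiguously:
% any two chains of rebracketings give the same isomorphism.
```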


Stacks, like EGA before it, is a wonderful reference but a terrible textbook. I think even the authors would agree with this! Fortunately there are many other books from which to learn algebraic geometry, after which Stacks will start to make a lot more sense.


Modern Algebraic Geometry is indeed highly abstract, but generally the conjuring of obscure objects is with a specific goal in mind, for example

- consolidation of many types of result into a 'simple' theoretical framework; I suppose this originates with Noether and reaches its apotheosis in Bourbaki's tracts.

- embedding of 'classical' objects (solutions to polynomial equations) inside a larger 'category' (schemes), where certain mysterious relations observed in the classical world (the Weil conjectures) have a more 'natural' interpretation (a fixed-point theorem) and light the way to a proof that would otherwise have been beyond reach.


I’m generally positive on rewriting OpenSSL in Rust, but I agree the comparisons aren’t completely scientific, or necessarily more important than correctness.

First, you should compare performance using the same implementations of the cryptographic primitives, as these are relatively easily interchangeable; we also don’t know whether, for example, OpenSSL’s optimised primitives were used. Secondly the rust language only excludes memory bugs, it doesn’t exclude errors in the implementation of the tls protocol or incorrect usage of cryptographic primitives which can be just as catastrophic for security. These have been prevalent in OpenSSL and are somewhat harder to prevent a priori; for all we know, these issues are worse in rustls than in OpenSSL. This is where formally verified implementations would be useful.


> Secondly the rust language only excludes memory bugs, it doesn’t exclude errors in the implementation of the tls protocol or incorrect usage of cryptographic primitives which can be just as catastrophic for security.

I linked to a talk about the project elsewhere and it's worth noting that the author of rustls leverages a lot of rust techniques that ensure certain correctness attributes at a semantic level, not just memory safety.

In particular, TLS libraries have long suffered from dealing with the complex composite state machines required by the protocol[0]. Rust makes the expression of safe state machines pretty easy (the talk demonstrates how).

[0]: https://www.mitls.org/pages/attacks/SMACK
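The technique usually meant here is the typestate pattern. A minimal sketch (the state and method names are made up for illustration, not taken from rustls): each protocol state is its own type, and a transition consumes the old state, so sending or receiving out of order is a compile error rather than a runtime bug.

```rust
// Each handshake state is a distinct type; invalid transitions
// simply don't exist as methods, and move semantics prevent
// reusing a stale state.

struct AwaitingHello;                       // initial state
struct AwaitingFinished { cipher: String }  // hello received
struct Established { cipher: String }       // handshake complete

impl AwaitingHello {
    // Consumes `self`: the old state can't be used again.
    fn recv_hello(self, cipher: &str) -> AwaitingFinished {
        AwaitingFinished { cipher: cipher.to_string() }
    }
}

impl AwaitingFinished {
    fn recv_finished(self) -> Established {
        Established { cipher: self.cipher }
    }
}

fn main() {
    let conn = AwaitingHello;
    let conn = conn.recv_hello("TLS_AES_128_GCM_SHA256");
    let conn = conn.recv_finished();
    println!("established with {}", conn.cipher);
    // Calling recv_hello again here would fail to compile:
    // `Established` has no such method, and the earlier states
    // were moved out of scope.
}
```

The payoff is that the composite state machine the protocol requires is checked by the type system instead of by ad hoc flags.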


One issue with OOP in practice that I've seen is the entanglement of the domain representation (the member variables of a class) with the varied operations on that data (its methods). Classic OOP encourages you to manipulate objects through methods rather than free functions, which combines potentially unrelated functionality in the same object.

My rule of thumb with objects is to keep methods to a minimum, to the extent that all classes are either interfaces, implementations of interfaces, or pure data classes. Obviously this approach will feel natural to ML programmers.
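As a toy illustration of that split (the invoice/tax example is mine, not from the discussion): the data type carries no behaviour, the interface carries no data, and the combining logic is a free function.

```rust
// Pure data: no methods beyond what the fields imply.
struct Invoice {
    subtotal_cents: u64,
}

// Interface: behaviour only, no stored state required of callers.
trait TaxPolicy {
    fn tax_cents(&self, subtotal_cents: u64) -> u64;
}

// One implementation of the interface.
struct FlatTax {
    rate_percent: u64,
}

impl TaxPolicy for FlatTax {
    fn tax_cents(&self, subtotal_cents: u64) -> u64 {
        subtotal_cents * self.rate_percent / 100
    }
}

// Free function tying data and policy together; `Invoice` never
// needs to know that taxation exists, so unrelated concerns stay
// out of the data type.
fn total_cents(invoice: &Invoice, policy: &dyn TaxPolicy) -> u64 {
    invoice.subtotal_cents + policy.tax_cents(invoice.subtotal_cents)
}

fn main() {
    let invoice = Invoice { subtotal_cents: 1000 };
    let policy = FlatTax { rate_percent: 20 };
    println!("{}", total_cents(&invoice, &policy)); // 1200
}
```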


The difficulty with learning 'modern' algebraic geometry is not only that it is very dense and general, but that this density means the original motivation can become lost.

So I think understanding the Weil conjectures is key for modern algebraic geometry. And it's always easier to start with algebraic curves (algebraic geometry in dimension 1) and their connection to Riemann surfaces (algebraic curves over the complex numbers with analytic rather than algebraic structure), as they provide motivation for many of the results and constructions.
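For reference, the rough shape of the conjectures (stated from memory, so check the sources below): for X smooth projective over F_q, with N_m the number of F_{q^m}-points,

```latex
\[
  Z(X,t) \;=\; \exp\!\Big( \sum_{m \ge 1} N_m \,\frac{t^m}{m} \Big).
\]
% Rationality: for $X$ of dimension $d$, $Z(X,t)$ factors as
\[
  Z(X,t) \;=\; \frac{P_1(t)\,P_3(t)\cdots P_{2d-1}(t)}
                    {P_0(t)\,P_2(t)\cdots P_{2d}(t)},
\]
% with a functional equation relating $t$ and $1/(q^d t)$, and the
% "Riemann hypothesis": every inverse root $\alpha$ of $P_i$ has
% $|\alpha| = q^{i/2}$. The degrees of the $P_i$ match the Betti
% numbers of a complex model when one exists, which is the bridge
% to topology that motivated the cohomological machinery.
```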

A good introduction to algebraic curves and the Weil conjectures that I've found is the following:

https://math.mit.edu/~poonen/papers/curves.pdf

For general algebraic geometry, J.S. Milne's notes are rather good:

https://www.jmilne.org/math/CourseNotes/ag.html

And for an introduction to commutative algebra, Atiyah-Macdonald's book is great.

