This needs an index and introduction. It's also not super interesting to people in industry? Like yeah, it'd be nice if bindless textures were part of the API so you didn't need to create that global descriptor set. It'd be nice if you could just sample from pointers to textures, similar to how dereferencing buffer pointers works.
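For reference, the workaround looks roughly like this today. A minimal sketch, assuming Vulkan 1.2 descriptor indexing; the array size and stage flags are arbitrary picks on my part, not anything the API mandates:

    #include <vulkan/vulkan.h>

    // One huge, partially-bound array of combined image samplers: the
    // "global descriptor set" that native bindless would make obsolete.
    VkDescriptorSetLayout makeBindlessLayout(VkDevice device) {
        const uint32_t kMaxTextures = 16384; // assumption: size to your limits

        VkDescriptorSetLayoutBinding binding{};
        binding.binding         = 0;
        binding.descriptorType  = VK_DESCRIPTOR_TYPE_COMBINED_IMAGE_SAMPLER;
        binding.descriptorCount = kMaxTextures;
        binding.stageFlags      = VK_SHADER_STAGE_ALL;

        // Allow holes in the array and updates while the set is in use.
        VkDescriptorBindingFlags flags =
            VK_DESCRIPTOR_BINDING_PARTIALLY_BOUND_BIT |
            VK_DESCRIPTOR_BINDING_UPDATE_AFTER_BIND_BIT;

        VkDescriptorSetLayoutBindingFlagsCreateInfo flagsInfo{};
        flagsInfo.sType         = VK_STRUCTURE_TYPE_DESCRIPTOR_SET_LAYOUT_BINDING_FLAGS_CREATE_INFO;
        flagsInfo.bindingCount  = 1;
        flagsInfo.pBindingFlags = &flags;

        VkDescriptorSetLayoutCreateInfo layoutInfo{};
        layoutInfo.sType        = VK_STRUCTURE_TYPE_DESCRIPTOR_SET_LAYOUT_CREATE_INFO;
        layoutInfo.pNext        = &flagsInfo;
        layoutInfo.flags        = VK_DESCRIPTOR_SET_LAYOUT_CREATE_UPDATE_AFTER_BIND_POOL_BIT;
        layoutInfo.bindingCount = 1;
        layoutInfo.pBindings    = &binding;

        VkDescriptorSetLayout layout = VK_NULL_HANDLE;
        vkCreateDescriptorSetLayout(device, &layoutInfo, nullptr, &layout);
        return layout;
    }

Bind that one set at frame start and every shader just indexes into it; if bindless were API-native, all of this ceremony would disappear.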
It's actually more efficient to do a hybrid approach, especially at high view distances. Rasterizing triangles is extremely fast, and is basically a perfect primary-ray intersector. Ethan Gore recently did some experiments with raytracing and said that for large scene volumes (his engine comfortably renders the entire 32-bit coordinate range, about 4 billion voxels per axis), it turns out to be faster to rasterize primary rays and raytrace shadows/GI.
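Roughly the frame structure being described: primary rays are perfectly coherent (one per pixel), which is exactly the case raster hardware is built for, while shadow/GI rays scatter incoherently, which is where RT hardware earns its keep. Every type and pass name below is a hypothetical stand-in, not his actual engine code:

    struct Scene   { /* geometry, acceleration structure, lights */ };
    struct Camera  { /* view and projection */ };
    struct GBuffer { /* depth, normal, albedo targets */ };

    // Rasterization acts as a very fast, perfectly coherent primary-ray
    // intersector: exactly one "ray" per pixel, hardware-accelerated.
    GBuffer rasterize_primary(const Scene&, const Camera&) { return {}; }

    // Secondary rays are incoherent; trace them from the rasterized hits.
    void trace_shadows(const Scene&, GBuffer&) {}
    void trace_gi(const Scene&, GBuffer&)      {}
    void composite(const GBuffer&)             {}

    void render_frame(const Scene& scene, const Camera& cam) {
        GBuffer gbuf = rasterize_primary(scene, cam); // raster for first hits
        trace_shadows(scene, gbuf);                   // ray traced shadows
        trace_gi(scene, gbuf);                        // ray traced GI
        composite(gbuf);
    }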
I've always wondered why voxel engines tend to produce output that looks so blocky. I didn't realize it was a performance issue.
Still, games like "C&C: Red Alert" used voxels, but with normal mapping that resulted in a much less blocky appearance. Are normal maps also a performance bottleneck?
I originally chose to go with axis-aligned blocks and hard axis-aligned normals because I liked the aesthetic. I've since slightly course-corrected; each voxel now has bent normals which follow the surface. How much the normals are bent is artist-configurable. This smooths out the look of the surface when viewed from a distance, but still gives the distinct blocky look up close.
In terms of performance, there is a cost to having fully 3D normals per voxel, but it's certainly manageable. There's a lot of other, more expensive stuff going on.
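Conceptually it's just a lerp between the two normals. A minimal sketch; the names and the single `bend` parameter are my assumptions, not the actual engine code:

    #include <cmath>

    struct Vec3 { float x, y, z; };

    Vec3 normalize(Vec3 v) {
        float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
        return {v.x / len, v.y / len, v.z / len};
    }

    Vec3 lerp(Vec3 a, Vec3 b, float t) {
        return {a.x + (b.x - a.x) * t,
                a.y + (b.y - a.y) * t,
                a.z + (b.z - a.z) * t};
    }

    // faceNormal:    the hard axis-aligned normal of the voxel face.
    // surfaceNormal: a smoothed normal following the overall surface.
    // bend:          artist-configurable in [0, 1];
    //                0 = fully blocky, 1 = fully smooth.
    Vec3 shadingNormal(Vec3 faceNormal, Vec3 surfaceNormal, float bend) {
        return normalize(lerp(faceNormal, surfaceNormal, bend));
    }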
Before Minecraft, basically all voxel engines used some form of non-axis-aligned normals to hide the sharp blocks, either through explicit normal mapping or, at the very least, by deriving intermediate surface angles from the Marching Cubes algorithm. Nowadays the blocky look has become stylish, and I don't think it even occurs to people that they could try to make the voxels smooth.
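For the curious, the classic recipe for those smooth normals: take the gradient of the voxel density field (the same scalar field Marching Cubes triangulates) with central differences. A sketch, where `density` is a hypothetical accessor for whatever field the engine stores:

    #include <cmath>

    struct Vec3 { float x, y, z; };

    float density(int x, int y, int z); // e.g. 0 = air, 1 = solid

    Vec3 smoothNormal(int x, int y, int z) {
        // Central differences approximate the density gradient; the
        // surface normal points from solid toward empty space.
        Vec3 n = {
            density(x - 1, y, z) - density(x + 1, y, z),
            density(x, y - 1, z) - density(x, y + 1, z),
            density(x, y, z - 1) - density(x, y, z + 1),
        };
        float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
        return len > 0.0f ? Vec3{n.x / len, n.y / len, n.z / len}
                          : Vec3{0.0f, 1.0f, 0.0f}; // degenerate: pick "up"
    }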
Voxels have been around since the 1980s. The smoothness came from that beautiful CRT and its inability to display crisp images. Normals weren’t really used until the early ’90s, when they were used heavily by games like Comanche by NovaLogic.
The reason Minecraft voxels are blocks is that Notch (Markus Persson) famously said he was “not good at art”. He didn’t implement the triangulation and kept them unit blocks. Games with triangulated voxels that came before Minecraft include Red Faction, Delta Force, and Outcast, just to name a few.
The point is, voxels aren’t anything special, no more than a texel, a vertex, a splat, a normal, or a UV. It’s just a representation of 3D space (occupied/not occupied) and can just as easily be used for culling as it can for rendering. The Minecraft style became popular because it reminded people of pixels, it reminded people of Legos, and Minecraft itself was so popular.
It depends on how the voxels relate to the gameplay.
Regardless of the original intent, in Minecraft the voxel grid itself is a very important aspect of the core gameplay loop. Smoothing the voxel visual representation disguises the boundaries between individual logical voxels and makes certain gameplay elements more difficult or frustrating for the player. When the visuals closely (or exactly) match the underlying voxel grid, it's easy for the player to see which specific voxel is holding back some lava or if they're standing on the voxel they're about to break.
In Minecraft you can, for example, clearly count how many voxels wide something is from a distance, because the voxels are visually obvious.
In Red Faction, you're never concerned with placing or breaking very specific voxels in very specific locations, so it's not an issue.
So your point is: Minecraft uses voxels on a unit scale, Red Faction uses them differently, so Minecraft wins?
I get the appeal of Minecraft, but Notch didn’t invent this stuff as much as you would love to believe. He simply used it in a way that made it easy to understand. To the point where people like you are explaining it to me like I have never played it. I have. I was one of the first testers.
Almost all of Minecraft is ripped off from other games. World creation: Dwarf Fortress. Mining: Dig Dug. The only original thing was the Creeper.
This seems like a needlessly antagonistic response? GP was only pointing out that the voxel shape is fundamentally important to Minecraft. It's not just a matter of Notch's artistic talent, as you said.
Anyway I don't think anybody is saying Notch invented this stuff or Minecraft was the first to do certain things. But it's probably worth pointing out that, ripped off or no, those other games haven't become remotely close to the popularity of Minecraft, so Notch clearly did something right... maybe the Creepers are why?
I don't think this should be understated. LEGO bricks are easy and fun to build with and don't require a lot of artistic talent. The same goes for block-based games like Minecraft.
Can you add more details? This seems to directly contradict GP. GP said ray tracing can do higher voxel counts = ray tracing is more performant (than rasterization).
Of course not, but unlike chat services, everyone has an email address or phone number, so if I need to reach out to them for something other than a casual chat, e.g. an invitation, a birthday felicitation, or a document that they should review later, email is a method through which I can reach all of them.
Similarly, reaching out to companies for support also often happens over email.
What's your specific concern here? I certainly wouldn't want to, e.g., give young kids unmonitored use of an LLM, or replace their books with AI-generated text, or stop directly engaging with their games and stories and outsource that to ChatGPT. But what part of "generate fun images for my kids - turn photos into a new style, create colouring pages from pictures, etc" is likely to be "unhealthy and bad for their development"?
Why do people pay for AI tools? I don't get it. I feel like I just rotate between them on the free tiers. Unless you're paying for all of them, what's the point?
I pay for Kagi and get all of the major ones, a great search engine that I can tune to my liking, and the ability to link any model to my tuned web search.
having an open source gpu runtime, from api to metal, would be nice. but as you can see, the real meat of the business (the compiler) will probably never-ever be open sourced, for internal political reasons. and it's the most interesting component.
it must be said the gpu crowd is very different to the cpu crowd. the cpu crowd trips over themselves to tell everyone about their ISAs. the gpu crowd keeps theirs very close to the chest. (the gpu isas are also quite batshit insane, but that's another conversation.) you wind up with almost polar opposite experiences in terms of how these two groups interact with the broader industry.
gpu people reeeaally don't want you prying your nose beyond the userspace APIs, in my experience.
EDIT - to add though... that is kind of understandable, because the gpu crowd is under a lot more pressure to support basically everything and everyone. opengl, dxil, spirv, opencl - and the dense matrix of combinations. I often see people hate on Apple for developing their own API (Metal), but honestly? I totally get it. in retrospect they probably did the right thing.
we have an epidemic of gpu programming "specs" and "standards", with no end in sight. i can't help but keep editing this comment with more of my hot takes: of course nvidia owns the GPU programming space. they're the only ones offering anything resembling an actual platform! and they will continue dominating the space, because the rest of the players are too busy playing in their own ball pits.
I think the only way to dislodge nvidia is to ape their silicon. a USSR needs to come along, steal their masks, and just make carbon copies. not a shim, just straight-up copy nvidia 100%. like, you run `nvidia-smi` and it reports back "yep, this is a real nvidia card". and then sell it for half the price. it would be totally illegal, but when you're a state actor You Can Just Do Things.
Anything with an ARM Mali family GPU, the same way the answer to "where are Nvidia drivers used" is "anywhere an Nvidia GPU is used". There isn't a premade list of people/companies who might ever use a certain brand of GPU in their products; it's "just about anywhere". That can be anything, including phones, tablets, mini PCs, laptops, SBCs, TV boxes, VR headsets, and so on - it's not limited to a specific product, manufacturer, or type of device.
If you're looking for something hackable to play with the Mali driver options yourself, Chromebooks or SBCs (like the one in the article) are usually the easiest bet, and they're where the development is done, versus more fixed-by-manufacturer devices like the typical phone, where you get whatever driver the manufacturer decides to package (which may or may not be the particular open driver you're looking to see used).
Hmm, not really. As mentioned above, anything including phones, tablets, mini PCs, laptops, SBCs, TV boxes, VR headsets, and so on. Chromebooks and TVs would just be 2 examples of these types of devices.
As a solid example, the screenshots from the article are not taken from a Chromebook or TV :).
Modern Vulkan code doesn't even use per-draw descriptor sets (bind groups) anymore. For buffers you just push the device address as a push constant, and for textures you push an index into a single large texture array.
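A sketch of the CPU side of that pattern; the struct layout and stage flags are illustrative, and the shader side is assumed to use GL_EXT_buffer_reference for the address plus one big sampled-image array for the index:

    #include <vulkan/vulkan.h>

    // 16 bytes of push constants replace per-draw descriptor churn.
    struct PushData {
        VkDeviceAddress vertexBuffer; // dereferenced via buffer_reference
        uint32_t        textureIndex; // indexes the global texture array
        uint32_t        pad;
    };

    void recordDraw(VkCommandBuffer cmd, VkPipelineLayout layout,
                    VkDeviceAddress vertexBufferAddr, uint32_t texIndex) {
        PushData pc{vertexBufferAddr, texIndex, 0};
        vkCmdPushConstants(cmd, layout, VK_SHADER_STAGE_ALL,
                           0, sizeof(pc), &pc);
        // The single bindless texture set is bound once per frame
        // elsewhere; the shader reads textures[pc.textureIndex] and
        // walks pc.vertexBuffer like a pointer.
        vkCmdDraw(cmd, 3, 1, 0, 0);
    }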