Are you saying the W3C should be paying to keep Tailwind development and maintenance going? The CSS standard is not the same as a usable library of components.
> others who deem Tailwind valuable enough will continue to maintain it.
We have seen several examples in the last couple of years where this simply isn't true: there are plenty of open source projects that don't receive enough TLC.
If my company relies on an open source project and it isn't being maintained, I can either ask my company to start maintaining it, find something else, or accept the risk of an unmaintained project.
The problem here is that the W3C sucks a fat one: they've failed to build specs that don't require an ecosystem of thousands of libraries to make using CSS, etc. simple or efficient.
The only part of Windows that really matters in the long run is Win32, which has been extremely stable. You could go back to XP and not lose that many features. The fact that modern Windows runs like ass has very little to do with backwards compatibility.
I don't necessarily disagree, but I do think there's an important distinction between technical debt and backwards compatibility. Yes, the former can be caused by the latter, but I've worked on enough projects that didn't have to worry about backwards compatibility, yet were still riddled with technical debt, to know that backwards compatibility is only one source among many.
Something to consider is that the subpixel layout of OLED is an engineering necessity to achieve longevity and cost (panel yield) objectives.
LCD can have a uniform layout because it's a passive layer doing the filtering. In OLED, each pixel is active, and that blue one is trying to burn itself out much faster than the other two.
QD-OLED is essentially one big blue emitter, with quantum dots converting part of that light to red and green. The geometry has to be optimized to spread the wear out accordingly. WOLED has a fourth, large white subpixel because the RGB color filters cannot pass enough light for high-luminance scenes.
On Windows I create a new locked-down user with NTFS permissions denied everywhere except the target project path. I then run the agent app as that user with otherwise unrestricted PowerShell access.
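Roughly, that setup can be scripted with the built-in `net user` and `icacls` tools. A minimal sketch, assuming a placeholder account name, password, and paths (it denies a couple of sensitive trees rather than literally "everywhere"; verify the resulting ACLs on your own machine before trusting them):

```python
import subprocess

AGENT_USER = "agentbox"            # placeholder account name
PROJECT = r"C:\work\myproject"     # the only path the agent may write

def run(cmd):
    # Fail loudly: a silently-missed ACL defeats the whole point.
    subprocess.run(cmd, check=True)

# Create the local account (use a real password, not this placeholder).
run(["net", "user", AGENT_USER, "CHANGE-ME-1!", "/add"])

# Deny the account access to sensitive trees. (OI)(CI) makes the deny
# inherit to all files and subfolders beneath each path.
for path in [r"C:\Users", r"C:\ProgramData"]:
    run(["icacls", path, "/deny", f"{AGENT_USER}:(OI)(CI)F"])

# Grant modify rights on the project path only. On NTFS, an explicit
# grant takes precedence over a deny inherited from a parent folder.
run(["icacls", PROJECT, "/grant", f"{AGENT_USER}:(OI)(CI)M"])

# Then start the agent as that user, e.g.:
#   runas /user:agentbox "powershell -NoProfile"
```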
FedEx is easily the best carrier in my area right now. It's not even close. If I need absolute certainty that something will arrive, I'm paying extra for the overnight priority option. All of my employers have seen it the same way. Even the holiday gift packages are sent FedEx.
USPS has been a disaster by comparison. We've been dealing with actual criminal elements in my local post office stealing and tampering with mail. I don't know of anyone in my community who hasn't filed a complaint with the USPIS. I don't even care about the packages anymore; I'm more worried about the tax forms, vehicle titles, and replacement credit cards getting lost now.
Any merchant who only offers USPS or uses them for the last mile of delivery is a no-go for me. Amazon is the most stressful experience because you never know which carrier you'll get. The best approach I've found is to use their pickup box and batch things to coincide with my grocery trip.
Obviously you're summarizing a frustrating problem, so my suggestion may be unhelpful. However, mail tampering is a serious crime. If everyone is complaining, and the USPS is doing nothing, it should be handled as a crime.
I'm not familiar with US law in this, but I know the FBI has stepped in for some cases that are over state borders. Anecdotally, that is.
And I also know that private citizens can bring criminal cases to the prosecutor if the police won't act. If there are a lot of you, it wouldn't be too expensive to jointly hire a lawyer. Because this is absolutely not normal, at all.
And it should be squashed, and hard, with people led away in cuffs.
If this problem is persistent, the local city should get involved too.
> With a small light source even a small change in position on the surface has big effects on the light’s visibility – it quickly becomes fully visible or fully occluded. On the other hand, with a big light source that transition is much smoother – the distance on the floor surface between a completely exposed and completely invisible light source is much larger.
This part of the demo illustrates the point-vs-area light issue really well. In designing practical 3D scenes and selecting tools, we would often prefer 2D area or 3D volumetric lights over point lights. Difficult problems like hard shadows and hotspots in reflection probes are magically resolved if we can afford these options. Unfortunately, in many real-time scenarios you can't get high-quality area or volumetric lighting without resorting to baked lightmaps (static objects only; lots of iteration delay) or nasty things like temporal antialiasing.
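The geometry behind the quoted passage is easy to make concrete. The standard penumbra-width estimate (the one PCSS-style soft shadows use; my notation, not the article's) is:

```latex
w_{\text{penumbra}} \approx w_{\text{light}} \cdot
  \frac{d_{\text{receiver}} - d_{\text{blocker}}}{d_{\text{blocker}}}
```

As w_light goes to zero the penumbra collapses to a hard edge, which is exactly the abrupt visible/occluded transition the demo shows for small light sources.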
There is a solution called Radiance Cascades [1] which doesn't require a denoiser for rendering real-time shadows for volumetric lights. Unfortunately the approach is relatively slow, so solutions based on denoising are still more efficient (though also expensive) in terms of the quality/performance tradeoff.
One issue with modern ReSTIR path tracing is that the algorithm currently relies on white (uncorrelated random) noise, whose low-frequency (large-scale) components produce blotchy, boiling artifacts at low sample counts. Ideally an algorithm would use some form of spatio-temporal blue noise with exponential decay, so that samples are evenly distributed and contain only high frequencies. But that's still an open research problem.
Having come from graphics in the '90s, practical high-performance answers typically involve fakery on both primary surface shading and shadow calculation.
I've pulled some tricks like "object-pre-pufficiation" (low-frequency model manifold encapsulation, then following the same bones for deformation) mixed with normal recording in shadow layers (for real-time work on old mobile hardware), but these days so much can be done with sampling and proper ray tracing that the old tricks are more novelty than necessity.
> My team has a sandbox where we create, deploy, and run AI-generated code without a human in the loop.
I think if you keep the human in the loop this would go much better.
I've been having a lot of success recently by combining recursive invocation with an "AskHuman" tool that takes a required tuple of (question itself, how question unblocks progress). Allowing unstructured assistant dialog with the user/context is a train wreck by comparison. I've found that chain-of-thought (i.e., a "Think" tool that barfs into the same context window) seems to be directly opposed to the idea of recursively descending through the problem. Recursion is a much more powerful form of CoT.
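For anyone curious what the required tuple looks like in practice, here's a minimal sketch of such a tool (the schema and names are my illustration of the idea, not the commenter's actual code):

```python
# Illustrative tool schema, JSON-Schema style as most LLM tool APIs accept.
# Both fields are required, so the model must justify every interruption.
ASK_HUMAN_TOOL = {
    "name": "AskHuman",
    "description": "Ask the human one blocking question.",
    "parameters": {
        "type": "object",
        "properties": {
            "question": {"type": "string", "description": "The question itself."},
            "unblocks": {
                "type": "string",
                "description": "How answering this unblocks progress on the current subtask.",
            },
        },
        "required": ["question", "unblocks"],
    },
}

def ask_human(question: str, unblocks: str) -> str:
    """Blocking handler: show the structured question, return the user's answer."""
    print(f"Agent asks: {question}")
    print(f"(Why this unblocks progress: {unblocks})")
    return input("> ")
```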
That's not what that image means at all. If you look closely, you'll even see three additional colors, plus white, beyond the four I'm guessing you identified.
Those are ERCOT load zones, a distinct concept and all within the ERCOT interconnection (grid).
On the markets side, Texas is mostly ERCOT, and then has portions in (in descending order) MISO, SPP, and the non-market West.
In terms of "grids" Texas is mostly ERCOT, and then the Eastern Interconnection with a small smidge of Western Interconnection in the far west in El Paso Electric's territory.
If you're trying to solve one very hard problem, parallelism is not the answer. Recursion is.
Recursion can give you an exponential reduction in error as you descend into the call stack. That's not guaranteed in the context of an LLM, but there are ways to strongly encourage some contraction in error at each step. As long as you are, on average, working with a slightly smaller version of the problem each time you recurse, you still get exponential scaling.
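To put a rough number on "exponential reduction" (my framing, not a claim about any particular system): if each recursive level shrinks the remaining error by an average contraction factor c < 1, then after d levels

```latex
\varepsilon_d \approx c^{\,d}\,\varepsilon_0, \qquad 0 < c < 1
```

Even a modest per-step contraction compounds quickly: c = 0.8 at depth 20 leaves about 0.8^20 ≈ 0.01 of the original error.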
In the case of CSS, we already have that:
https://www.w3.org/Style/CSS/Overview.en.html