Don't think the scenario perfectly fits because before generics you couldn't have a method that takes in an Array of Animal, just an Array.
Could a language resolve this issue without generics or some other contract feature (which Java did later add), and without limiting the scope of array definitions and their use?
It may be good to note that Array&lt;Dog&gt; doesn't actually extend Array&lt;Animal&gt;, so there shouldn't be any polymorphic usage between them, and the return-statement and method-call scenarios are compile-time errors with proper generics.
> before generics you couldn't have a method that takes in an Array of Animal, just an Array.
Arrays aren't generic types like ArrayList. Arrays are core language constructs parameterized by their element types. Array types are not erased in Java.
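To make the contrast concrete, here's a minimal sketch (the `Animal`/`Dog`/`Cat` classes are stand-ins for whatever hierarchy the original scenario used): Java arrays are covariant and check element types at runtime, while generic collections are invariant and reject the same mistake at compile time.

```java
import java.util.ArrayList;
import java.util.List;

class Animal {}
class Dog extends Animal {}
class Cat extends Animal {}

public class Variance {
    public static void main(String[] args) {
        // Arrays are covariant: Dog[] is a subtype of Animal[], so this compiles...
        Animal[] animals = new Dog[1];
        try {
            animals[0] = new Cat(); // ...but the element type is checked at runtime
        } catch (ArrayStoreException e) {
            System.out.println("ArrayStoreException: arrays keep their element type at runtime");
        }

        // Generic types are invariant: a List<Dog> is NOT a List<Animal>,
        // so the equivalent mistake is rejected at compile time:
        // List<Animal> list = new ArrayList<Dog>(); // does not compile

        List<Dog> dogs = new ArrayList<>();
        dogs.add(new Dog());
        System.out.println("dogs.size() = " + dogs.size());
    }
}
```

This is exactly why array types aren't erased: the runtime needs the element type to perform that store check.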
Somewhat interesting and related: software that performs medical diagnoses better than doctors has existed since the 1970s (Mycin, an expert system). It seems like nowadays this kind of solution could possibly be provided at scale at a consumer level (processing new medical knowledge by itself, running on commodity hardware). I'm not an expert myself though, so I don't know if that's totally true. What exciting times, though.
"it proposed an acceptable therapy in about 69% of cases, which was better than the performance of infectious disease experts who were judged using the same criteria."
If they are profiting off their work to the extent that it actually makes business sense for the game company to sue them for damages, then that is the action they can expect to get.
Most of the important stuff is certainly done server-side, like determining whether a Pokemon appears at coordinates x,y or not (though my guess is the client is tasked with this, and then says "Hey server, I'm showing there should be a Pidgey at x,y, I'm gonna try to catch it." The server confirms by running the exact same deterministic check on x,y, then says "Yep, I see it too," or "No, you're a filthy liar, Joey.")
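That "exact same deterministic check" could look something like this: a pure function seeded from the location and a time bucket, so client and server independently agree on a spawn without an extra round-trip. To be clear, this is pure speculation to illustrate the idea; the function, seed mixing, and 5% threshold are all invented.

```java
import java.util.Random;

public class SpawnCheck {
    // Hypothetical: both client and server run this same pure function, so they
    // agree on whether a Pidgey spawns at (cellX, cellY) in a given time window.
    // The seed formula and spawn threshold are made up for illustration.
    static boolean pidgeyAt(int cellX, int cellY, long timeWindow) {
        long seed = cellX * 31L + cellY * 131L + timeWindow * 1031L;
        return new Random(seed).nextDouble() < 0.05; // invented 5% spawn chance
    }

    public static void main(String[] args) {
        long window = 42L; // e.g. an hour-of-day bucket
        // Client claims a spawn; the server re-runs the identical check to confirm.
        boolean clientSees = pidgeyAt(10, 20, window);
        boolean serverSees = pidgeyAt(10, 20, window);
        System.out.println("agree = " + (clientSees == serverSees)); // always true
    }
}
```

Since the function is deterministic, the server can always call out a "filthy liar" client that claims a spawn the function doesn't produce.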
But for the actual "I caught it" or "I missed it," unless the exact user input is sent to the server, and then the server simulates the projected path of the throw, the client seems like it gets to say that. So perhaps the client could actually say "I hit it perfectly," and the server just says "Well if you say so."
I've seen the app load something (and hang while doing it) between pokeball throws. That seems like evidence that it's asking the server about something after each throw, or once every n throws.
Checking if a throw resulted in a capture server-side would be weird; I don't see why it couldn't be done on the client. It could be querying for the player's inventory, to see if they have enough pokeballs? It'd still be better to just sync that once before the fight...
My best guess is that the conditions for a Pokemon running away involve something only the server knows. A weak example: maybe it considers other players in the area; pokemon might be more likely to run away if the area is too crowded? So every once in a while the client asks the server how many players are nearby, or something.
I've noticed higher CP pokemon tend to run away more quickly. You also have to remember that there are now great balls and razz berries to influence throw chances, so the client probably publishes the events it's doing (gave a razz berry, threw a pokeball, threw a great ball) and the server creates some randomness that says yes/no.
That "randomness generation" still seems like it could easily be done locally, and it'd reduce server load. But maybe there are more advanced mechanisms we don't know of.
if it's done locally then people will bodge the client so the random is no longer random. The server's always going to have to do the math, even if to just ensure the client's not lying.
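That server-authoritative pattern could be sketched like this: the client only reports what it did (ball type, berry, throw quality), and the server owns the RNG, so a bodged client can't force a capture. Every multiplier and formula here is invented; it's just the shape of the idea, not the game's actual math.

```java
import java.security.SecureRandom;

public class CaptureRoll {
    // Speculative sketch: the client reports events; the server computes the
    // probability and rolls its own random number. All numbers are invented.
    static double captureChance(int cp, boolean razzBerry, boolean greatBall) {
        double base = Math.max(0.05, 0.9 - cp / 1000.0); // higher CP -> harder
        if (razzBerry) base *= 1.5;
        if (greatBall) base *= 1.5;
        return Math.min(base, 0.95); // never a guaranteed catch
    }

    static boolean serverRoll(int cp, boolean razzBerry, boolean greatBall, SecureRandom rng) {
        // The roll happens server-side, so the client can't tamper with it.
        return rng.nextDouble() < captureChance(cp, razzBerry, greatBall);
    }

    public static void main(String[] args) {
        SecureRandom rng = new SecureRandom(); // server-side randomness only
        boolean caught = serverRoll(800, true, false, rng);
        System.out.println(caught ? "Gotcha!" : "It broke free");
    }
}
```

The client can still animate the throw locally; it just has to wait for the server's yes/no before showing the final result, which would also explain the lock-ups people are seeing.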
> Checking if a throw resulted in a capture server-side would be weird, I don't see why it couldn't be done on the client.
It locks up [server access icon spins] a lot at the point where the ball closes and the game is deciding whether to capture or release the pokemon (what's the heuristic there?), and it often crashes at that point too.
I wouldn't be surprised if this is because it's sending the whole of the swipe data to the server. Surely it doesn't need to check whether there are pokeballs available: when your client says "gotcha," the server will already have a record of whether balls were available or not.
To answer your first question: yes, you can configure everything; the type of ball used, whether the throw is curved or not, and the "excellence" of the throw. No one gets excellent throws all the time, so I think it's easily detectable and hence bannable.
Regarding the reliance on Emacs, might not be the case anymore. When I was new to Clojure, I immediately used an IntelliJ plugin called Cursive. It was/is great, same experience as writing Java in IntelliJ. (https://cursive-ide.com/)
But the problem is you really do need a "paredit" for your editor when developing in Clojure. It's a double whammy for beginners, and a huge one at that. Pretty much every other language ever is perfectly editable in everyone's editor as it exists today.
Check out parinfer! It's a great new editing mode for lisps that doesn't require memorizing any special commands. It's built into Cursive as well as my own IDE (Nightcode).
You don't even need a full-blown "paredit", just the ability to jump back and forth between different kinds of opening and closing braces. I've been using vim for Clojure exclusively since I first started dabbling with it in 2011; I'll die before succumbing to emacs... A bigger issue, though, is that to really sell Clojure you need to sell the alternate development mindset it represents as a lisp. With emacs, you can assume slime, and immediately you're exposing people to developing their code iteratively instead of the traditional approach of edit file, save, run from scratch.^1 With every other editor you need to first give beginners^2 something to approximate slime^3, or else you're going to present developing in Clojure the same way you'd present developing in Java, which is to say you'll lose a lot of the punch.
^1 As an aside this is why I'll never be satisfied with plain Java/JavaScript no matter how many new static features they get so long as they remain fundamentally opposed to the build-up style of programming.
^2 By beginner I mean to Clojure; I don't think Clojure is a good language for programming beginners. It is suitable for people as a second or later language, but not their first. So basically I agree the need to explain the preferred editing environment is a hurdle -- if the person's first language was Scheme, maybe there wouldn't be a problem since you don't really-really need paredit or syntax highlighting or auto-indent or anything besides the ability to edit and save text.
^3 For me that's just a separate terminal in a screen session that a vim plugin talks with, though there are others (and probably better, I haven't tried them seriously) for vim.
As a beginner I found that learning a single keyboard shortcut - Ctrl+W - "expand the selection to the surrounding semantic unit" was a huge productivity boost (the opposite action is Shift+Ctrl+W).
Coupled with the setting Settings→Editor→Smart Keys→"Surround selection on typing quote or brace", it's amazing.
My point is you have to use a paren tool to keep lisps sane. That's not true of any other type of language. So when trying to introduce someone to clojure, they see not only a very strange language but a new tool requirement as well. It really creates barriers to entry.
And for those of us who prefer editors (vs IDEs), Proto REPL for Atom (https://github.com/jasongilman/proto-repl) is a great option to work with Clojure. I think it deserves more attention, as it can help a lot with Clojure adoption.
I am guessing you've made jeans before; if so, can you share an estimate of the cost of materials and the time it takes for one pair of jeans? I'm curious because $225 for time and materials seems like too much to me.
Perhaps these guys in the video run too small of an operation to have the costs spread around and to invest in tech to speed up the process.
Not jeans, but rather very expensive dresses, and at a very small scale.
A $1500 dress probably has $150-200 in materials (bought at or near wholesale prices); the jeans in the above video probably have about $20 in materials.
Labor is the killer: cutting, sewing and finishing one of those dresses is a 20-30 hour job. There are other costs too, some ongoing, like customer service, shipping, and marketing, and some one-time, like grading patterns so that you can make more than one size of an item, or getting photos taken of it for your store.
There isn't really a lot of "tech" in clothing manufacture. Sewing machines haven't changed much in 50 years, the fact that 50 year old equipment is STILL used for production should tell you something about the industry. It is HARD to automate something that is so nuanced.
The way it happens in an industrial setting is the same way it's always been done: lots of people doing the same job over and over on an assembly line. Yes, some things have been automated to some degree, but not to the extent they have in other industries.
Edit: Yes I have made jeans for myself before, but never at scale.
There are a few pictures in the various GitHub issues; it looks like a Scratch-like programming environment, unless that was just the developer's own software.
Similar to that, I really like how in IntelliJ I can use my Maven or Gradle build script as the IDE project's property file. I can now have my source checkout not include any helpful IDE-specific files, but still have the project be instantly imported into an IDE.
A possible fix is to start investing, but it may be too late. If I remember correctly, investing for profit is a gray area in Islam, but Kuwait did it, and at one point they used to earn more from their investments than they did from oil, paying every citizen a base salary from it (until Saddam invaded and they had to cash out).
Trade and investing is permissible in Islam.
However, interest and derivative financial instruments are strictly prohibited for various reasons, one of which is trying to avoid a "renters' economy".
They are still capable of making world-class products that drive revenue. Their fantasy football UI is the best IMO, and they managed to use that platform to drive the daily fantasy gambling they have tied to it. Hopefully they can repeat that for Yahoo Banking, etc.