> Dynamic languages are fast enough to implement internet services and outgrow the demeaning term "scripting language".
This is never going to happen, because "scripting language" is used as a slur or insult, rather than a term with any usefully defined meaning. It's not possible for Python to "outgrow" being a scripting language as long as that's used as a condescending shorthand for "doesn't look like C++".
Consider Ruby. How many applications do you know of which use Ruby for scripting? Or Python -- I think my system has two applications which can be scripted in Python (Gimp and Blender), but many dozens of applications written in Python. If Ruby and Python are scripting languages, then so are Smalltalk, Haskell, Boo, or dozens of other high-level languages. And yet, you'll never hear of these being called "scripting languages" because nobody has an axe to grind against them.
------------
Unrelated to that, Haskell does have pointers (including null pointers) -- you can use them just like pointers in C/C++.
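To make that concrete, here's a minimal sketch using the standard `Foreign.Ptr` / `Foreign.Storable` / `Foreign.Marshal.Alloc` machinery from base: allocate a word of memory, write through the pointer, read it back, and compare against `nullPtr`, much as you would in C.

```haskell
import Foreign.Marshal.Alloc (alloca)
import Foreign.Ptr (nullPtr)
import Foreign.Storable (peek, poke)

main :: IO ()
main = alloca $ \p -> do
  poke p (42 :: Int)     -- write through the pointer, C-style
  x <- peek p            -- dereference it
  print x                -- 42
  print (p == nullPtr)   -- False: alloca hands back a real allocation
```

Of course, in idiomatic Haskell this lives behind the FFI boundary rather than in everyday code, but the capability is there.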
That's a really wonderful link, and I wish more people used the definition it advocates. I've one minor complaint, in that it conflates typeless and dynamically-typed languages, but considering when it was written that's not a big deal -- strong, dynamically-typed languages probably didn't exist in any significant way.
Unfortunately, almost every use of "scripting language" I've ever heard is more along the lines of:
"We can't write our GUI frontend in Python^! Scripting languages are too slow for rendering advanced graphics."
Usually uttered by somebody who's been writing C++ for 20 years and refuses to touch anything more recent.
^ Substitute Boo, Haskell, Scala, Clojure, or anything else without enough curly braces
(I know this is just an example but I can't resist commenting)
"We can't write our GUI frontend in Python^! Scripting languages are too slow for rendering advanced graphics."
After downloading and using Avant Window Navigator on Linux a little bit, I came to exactly that conclusion. It was horrifically slow. Maybe I'm so much smarter than the authors of that program that I could write something acceptable where they couldn't. But I wouldn't want to take that chance.
Maybe Python gives other scripting languages a bad name. I've written large web apps in Python and Ruby, but for desktop programs, the choices with acceptable speed that I know of are C++, Java, and C#.
Rapid-prototyping languages make sense for the web. For the desktop, I think the problem is that OSes have been adding crud at the same rate Intel has added speed. Also, the size of hard disks has increased even more quickly, so an app that happens to index files must index 100x as many, again negating the increased processor speed.
Well I'm really not sure what's happening because other dock programs work fairly well.
Also, even the demo video here looks "slow" to me:
http://www.youtube.com/v/mHDL4dIH10I - the icons' changes lag behind the movement of the mouse. Perhaps it's just the way the thing is animated - what feels responsive is different from what merely runs quickly.
strong, dynamically-typed languages probably didn't exist in any significant way
Need I do the Smug Lisp Weenie thing?
That said, a lot of people didn't make the distinction between dynamically-typed and untyped twelve years ago. It was common to refer to Scheme as untyped then, and I think a few people still do.
I haven't used LISP in years, so my information may be out-of-date, but isn't it mostly untyped? I don't remember it having any mechanism for defining types -- the only types were conses and atoms (numbers, strings, etc). There was no way to say "this string is a Name and this is a City, and they are different types".
Common Lisp has several kinds of user-defined types (types, structs, classes). It is possible to write non-trivial code without ever using them, and I don't think they really fit in to traditional Lisp style. Classes are fairly popular in modern Common Lisp code.
I agree with you, but I think jmillikin's point is when most programmers hear "scripting language", they think "dumbed-down programming language" rather than "cleanly separated layers of design".