
Why weakly typed?

Started by July 04, 2004 07:47 PM
28 comments, last by Zahlman 20 years, 4 months ago
Quote: Original post by liquiddark
The best argument for static typing, in my experience, is a very simple one: Intellisense. With dynamic typing, you often have to look up or remember every function call prototype in perfect detail most of the time. This also happens when you're using a sub-par development environment, but there is really no reason why you should be using such an environment.

Never underestimate tools for boosting productivity. They make all the difference when your brain refuses to start.

ld


I don't think I want to be programming when my brain is off, you know? And there is a lot of stuff which can make a programmer more productive than intellisense, such as having an interactive interpreter, having a debugger that supports edit and continue, etc. Here's another one that's pretty much specific to Smalltalk: availability of the source for the entire system right into your code browser. This means you can search for anything about anything. You want to know what you can do with strings? Just go see the String class and check out its methods (and the superclass methods too). The code is right there in front of your eyes. You can even modify it if you want! Isn't that amazing?

Another thing that dynamic typing allows: macros. Have you ever seen Lisp macros? They largely kick intellisense when it comes to productivity, they allow for things not possible in other languages. And all attempts at making macros in statically typed languages have failed.
Quote: Original post by GnuVince
there is a lot of stuff which can make a programmer more productive than intellisense, such as having an interactive interpreter, having a debugger that has support for edit and continue, etc.
(...)
Here's another one that's pretty much specific to Smalltalk: availability of the source for the entire system right into your code browser.

I fail to see how these offer more productivity gain. I've been working with VB for 5 years now. Edit and continue is nice, but it's bs in serious development, encouraging you to make the fix right away without consideration of the full scope. I don't even have any idea what you mean when you mention an interactive interpreter - how is this better or worse than an IDE which properly supports watches & value queries?

As for the Smalltalk all-source-is-present angle: if I have to see the source of the library to use it, I consider it broken. I'm writing software to do work, not to tickle my geek button.



Quote: Another thing that dynamic typing allows: macros. Have you ever seen Lisp macros? They largely kick intellisense when it comes to productivity, they allow for things not possible in other languages.

The first page I came across on Google had the following warning:
Quote: Since macros are so much harder to use than functions, a good rule of thumb is: don't use defmacro if defun will work fine.

Here's my "order of desperation" rule of thumb:
0) Generated statically typed code - Use a designer for most asset creation; all objects are constrained to a static type system. Annoying and unsatisfying, but incredibly powerful and safe.
1) By-hand static typing - interfaces are defined and used throughout, all creates are done early bound. This is my common modus operandi, and the only major problem I can think of with the system is related to deployment in a distributed environment.
2) Late bound creates - interfaces are defined and used throughout, creates use the ProgID or equivalent. Very nice to have, solves some of the problems with distributed operation (tho not all)
3) Dictionary/Reflection-based method calls - interfaces are not defined, creates are late-bound. I'm used to this level of work, and think it's pretty much the last useful technique for software that actually solves a problem.
4) Generated dynamically typed code - generation done at design time, but against a dynamic type system. I consider this to be borderline pathological, as the generator has no guarantees that the items it requires are satisfied by the type system in place thus far.
5) Dynamically generated code - generation done at runtime. This is pathological, because not only do I not know where bad behaviour is coming from, but I have never even seen the problem code.
6) Self-modifying code - Let's not even discuss it. It's not even going to be possible in upcoming processor models. In fact, #5 probably won't be either.
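The poster's examples are VB/COM-flavoured, but level 3 translates directly to Python. Here is a minimal sketch of "dictionary/reflection-based method calls" (the class and method names are made up for illustration): no interface is declared, and the method is looked up by its string name at runtime.

```python
class Greeter:
    def hello(self, who):
        return "hello, " + who

obj = Greeter()
method = getattr(obj, "hello")  # late-bound lookup by string name
print(method("world"))          # hello, world
```

Nothing checks at compile time that `Greeter` actually has a `hello` method; a typo in the string only surfaces when the lookup runs.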

That gives you some idea where I'm coming from.

ld
No Excuses
Quote: Original post by liquiddark
I fail to see how these offer more productivity gain. I've been working with VB for 5 years now. Edit and continue is nice, but it's bs in serious development, encouraging you to make the fix right away without consideration of the full scope.


It can also mean you can try something without having to rebuild the entire project. Let's say a method expects you to pass it an integer, but you forgot to convert the string to an integer: with edit and continue you just remove the quotes, continue, and see if the program works. You can later make the adjustment.

Quote:
I don't even have any idea what you mean when you mention an interactive interpreter - how is this better or worse than an IDE which properly supports watches & value queries?


Basically, you write 2+2, you press Enter, and the system prints 4. It's a way to test stuff immediately (interactively). Extremely useful for speeding up development.

Quote:
As for the Smalltalk all-source-is-present angle: if I have to see the source of the library to use it, I consider it broken. I'm writing software to do work, not to tickle my geek button.


How is it broken? You don't need to see the source to use a library (especially in Smalltalk, since method names are so descriptive), but it's there if you need to know how it works. Having access to source code has been invaluable to me in learning Smalltalk. I could see patterns in how people did things, I learned about useful methods that I would otherwise have rewritten myself, etc.
Quote: Original post by GnuVince
You can later make the adjustment.

To be explicit, this is what I meant when I was talking about it being bs. In general people DON'T make the adjustment later.

Quote:
Basically, you write 2+2, you press Enter, and the system prints 4. It's a way to test stuff immediately (interactively). Extremely useful for speeding up development.

I agree that this is extremely useful. It's not useful in a test-first environment, however, since your test is required to do this for you.

Quote: How is it broken? You don't need to see the source to use a library (especially in Smalltalk since method names are so descriptive), but it's there if you need to know how it works. Having access to source code has been invaluable to me in learning Smalltalk. I could see patterns on how people did stuff, I learned about useful methods that I would have rewritten myself otherwise, etc.

It isn't going to make you more productive to do this stuff, however. Resources for learning good code habits are better given their own space, complete with documentation on the whys and wherefores.

An alternative phrasing of the above: Given the choice between the ability to scroll through a specific subset of a large library on-demand in a concise way OR the ability to easily get to the source code & thereafter be left to your own devices, which produces higher productivity gains? My automatic answer is the former, and I'd certainly be interested to see a convincing argument for the latter.

ld
No Excuses
Quote: Original post by liquiddark
I agree that this is extremely useful. It's not useful in a test-first environment, however, since your test is required to do this for you.


That's not why I use it. Let me give you an example. A few months ago I wrote a basic dictionary-based http password cracker (in Python) (for educational purposes only, of course; I absolutely did not want to try and get in the member section of www.cutebabeswithbigboobsfromnorthpole.com ;)). I was not at all familiar with the HTTP library, so I went to the python.org website, found the documentation I needed, and with the interactive interpreter, I could immediately try the functions I was reading about.
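One building block of such a program that you might poke at in the interpreter while reading the HTTP docs is the Basic-auth header a dictionary-based password guesser would send for each candidate password. A minimal sketch (the credentials are hypothetical):

```python
import base64

def basic_auth_header(user, password):
    # HTTP Basic auth: base64-encode "user:password" and prefix with "Basic "
    token = base64.b64encode(("%s:%s" % (user, password)).encode()).decode()
    return {"Authorization": "Basic " + token}

print(basic_auth_header("admin", "secret"))
# {'Authorization': 'Basic YWRtaW46c2VjcmV0'}
```

In an interactive session you can call this with one candidate at a time and inspect the result immediately, which is exactly the workflow described above.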

At a later time, I decided it might be a good idea to get a little familiar with C# and .NET. Being a Linux guy, I used Mono, but Mono/C# suffers from the same problem as .NET on Windows: you cannot use it interactively. So, after I found the documentation I needed on MSDN, I had to write a test program to make sure I understood how it worked. That meant opening my editor, writing code, saving, compiling, running, and repeating as needed. While I eventually got what I needed and was able to write the program, testing the classes and methods I needed was much more inconvenient.

Quote:
An alternative phrasing of the above: Given the choice between the ability to scroll through a specific subset of a large library on-demand in a concise way OR the ability to easily get to the source code & thereafter be left to your own devices, which produces higher productivity gains? My automatic answer is the former, and I'd certainly be interested to see a convincing argument for the latter.

ld


That would depend on the quality of the documentation I guess. The .NET documentation is numero uno, I must say, but it would still be nice to be able to look at the source (you can if you use Mono, but it's a bit inconvenient). The best would be to have both (documentation and easy source access), but I guess you can't have that :)
I grant you this: learning in a weakly typed environment is likely easier. I certainly didn't have much trouble picking up enough Python to be productive when my RPG buddies decided to play online with OpenRPG. My concern is for production code, however, and in that environment I think the single gain of Intellisense outweighs the advantages of "weak" typing (not really weak, as described above) in most cases, hands-down.

ld
No Excuses
An image speaks a thousand words, they say.
Does it work with user-created modules? I can't find any info one way or the other - all they say is
Quote: IntelliSense has been broadened, and now supports most imported modules.


Definitely a nice boost, tho, regardless. I still prefer the explicitness of static typing for production code, but this rekindles my desire to learn Python.

ld

No Excuses
I love Python and its dynamic-plus-strong typing combination. I've never once been bitten by it in the entire time I've been using Python--about 2 months. After coding in Python, I don't think I can bring myself to program anything large in C++ or Java.
Quote:
That's not why I use it. Let me give you an example. A few months ago I wrote a basic dictionary-based http password cracker (in Python) (for educational purposes only, of course; I absolutely did not want to try and get in the member section of www.cutebabeswithbigboobsfromnorthpole.com ;)). I was not at all familiar with the HTTP library, so I went to the python.org website, found the documentation I needed, and with the interactive interpreter, I could immediately try the functions I was reading about.

I did the exact same thing with the Python socket library. To try it out, I just set up a basic "server" in one Python interactive interpreter and a basic "client" in another window. It took a little while to get working, but not as much time as it would have taken to actually write it all in a file and execute it.
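A minimal sketch of that experiment, squeezed into one script with a thread standing in for the second interpreter window (the uppercasing "protocol" is made up for illustration):

```python
import socket
import threading

def serve_once(server):
    # Accept one connection, echo the data back uppercased, then close.
    conn, _ = server.accept()
    conn.sendall(conn.recv(1024).upper())
    conn.close()

server = socket.socket()
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
threading.Thread(target=serve_once, args=(server,)).start()

client = socket.socket()
client.connect(("127.0.0.1", server.getsockname()[1]))
client.sendall(b"hello")
reply = client.recv(1024)
print(reply)  # b'HELLO'
```

In two real interpreter windows you would type the server half in one and the client half in the other, adjusting each line interactively as it fails.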
I don't like the dynamic+strong combination myself, though I may just need to spend a good long time getting used to it. The basic problem is that I find I need to *remember* types of things that aren't written down anywhere. This leads to including type information in comments or documentation instead of a signature, so I still have to write it even though I'm not getting compile-time checking in return. Perl gets around this by not only being incredibly weakly typed, but just plain not having that many intrinsic types. :/
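An illustration of that complaint (the function is a made-up example): the types live only in the docstring, so nothing stops a caller passing the "wrong" thing as long as it quacks appropriately.

```python
def scale(vec, k):
    """Scale a sequence of numbers.

    vec: list of float, k: float -- recorded here in prose only,
    with no compile-time check to back it up.
    """
    return [k * x for x in vec]

print(scale([1.0, 2.0], 3.0))  # [3.0, 6.0]
print(scale("ab", 3))          # ['aaa', 'bbb'] -- no type error, just a surprise
```

The second call is exactly the kind of silent misuse a static signature would have rejected at compile time.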

In a week of using Python I've also been bitten by
- unintended infinite recursion in defining a __repr__ (though I don't really need it anyway, so meh)
- the need for explicit 'self.' (again and again and again!)
- bool not behaving as much like an int subtype as I'd expect (mainly: __nonzero__ inexplicably defaulting to __len__ rather than __int__ - and for that matter, __nonzero__ inexplicably not being called __bool__, for parallelism with __int__ and __str__)
- the double-copy semantics of the 'immediate' operators
- the lack of Perl-style autovivification (I suppose most wouldn't expect it though)
- '=' being in statement grammar rather than expression grammar (since this is a *syntactic* rather than a *semantic* check, you can't use = inside an expression even when it would translate into a __setitem__ call). This results in an IMHO rather bizarre asymmetry:

>>> class foo(dict):
...     def __setitem__(self, key, value):
...         dict.__setitem__(self, key, value)
...         return "wtf!"
...     def __init__(self): pass
...
>>> myfoo = foo()
>>> myfoo["bar"] = "baz"  # note, nothing displays
>>> x = (myfoo["bar"] = "baz")
SyntaxError: invalid syntax
>>> x = myfoo.__setitem__("bar", "baz")
>>> x
'wtf!'


But all the same, I'm still loving the language. :) I've picked up/worked out neat tricks like

- subclassing int for type-safe enums (which can have a custom __str__, and custom creation-from-string!)
- some nice stuff with reflection that makes parsing a whole lot easier - translate string into name of method, feed it into some object's .__dict__, and get a function object, which can then be called with the rest of the input.
- dynamic loading of modules, which together with the treatment of classes *and modules* as objects, allows for some rather neat ideas for packaging stuff.
- doc strings being a module property - lets the program access "help info" that's written inline with the source, rather than having to go and open some file.
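A sketch of the first trick above, subclassing int for a type-safe enum with a custom __str__ and creation-from-string (the Color class and its value names are made up for illustration):

```python
class Color(int):
    _names = {0: "RED", 1: "GREEN", 2: "BLUE"}

    def __str__(self):
        # Print the symbolic name rather than the raw integer.
        return Color._names[int(self)]

    @classmethod
    def from_string(cls, name):
        # Custom creation-from-string: reverse lookup of the name table.
        for value, n in cls._names.items():
            if n == name:
                return cls(value)
        raise ValueError("no such Color: " + name)

green = Color.from_string("GREEN")
print(str(green), int(green))  # GREEN 1
```

Because Color is still an int, its values compare and index like plain integers, while __str__ and from_string give the symbolic behaviour most enum hacks lack.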

This topic is closed to new replies.
