
On leaving .NET

Started September 24, 2010 09:48 AM
31 comments, last by ranakor
Quote:
Original post by frob
Choose a tool:

[Images: three photos captioned "What the client asks for", "What they really need", and "What they can afford"]
What's the building in that first photo?
It looks like a boat-shaped hotel in Singapore that a friend showed me a picture of, but I can't confirm that. I don't imagine there are too many buildings that look like that, though.
You know, I don't know how relevant this will be, but lately I've been pondering, as a thought experiment, what is actually so bad about the languages and tools we (myself included) have dismissed as 'quaint' and inefficient.

For example: C. Or C++, if you decide to use select parts of it: the SC++L, templates for straightforward generic programming (not the mind-bending, Turing-complete, compile-time-metaprogramming-look-at-me-mom voodoo stuff), classes for their built-in "security" (access control) facilities, and so on. Why does that *have* to be ugly and unproductive? Sure, C doesn't support automatic memory management, but that doesn't mean you can't build something like it yourself. It can range from something very simple, if your needs are simple, to something more advanced, or you can use an existing library that does it. The point is, C does allow explicit pointer arithmetic, but that doesn't mean it *has* to be all over the code, does it? If anything, C gives you basic building blocks you can use to structure your program in a way that makes sense to *you*, or your team.
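To make that concrete, here is a minimal sketch of what such a hand-rolled "automatic-ish" memory layer could look like, assuming nothing beyond malloc/free; the rc_alloc/rc_retain/rc_release names are invented for the example:

#include <stdlib.h>

/* Header stored in front of every managed allocation. */
typedef struct {
    size_t refcount;
} rc_header;

/* Allocate a block whose lifetime is tracked by a reference count of 1. */
void *rc_alloc(size_t size) {
    rc_header *h = malloc(sizeof(rc_header) + size);
    if (h == NULL) return NULL;
    h->refcount = 1;
    return h + 1;   /* hand the caller the payload, not the header */
}

/* Another owner takes a reference: bump the count. */
void *rc_retain(void *p) {
    if (p != NULL) ((rc_header *)p - 1)->refcount++;
    return p;
}

/* An owner lets go: free the block when the last reference drops. */
void rc_release(void *p) {
    if (p == NULL) return;
    rc_header *h = (rc_header *)p - 1;
    if (--h->refcount == 0) free(h);
}

Not thread-safe, no cycle handling, barely twenty lines; but if your needs are simple, that may be all the 'automatic' memory management you actually need.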

And this brings me to another thing: it's true that with such languages you will sometimes wrestle with a lot of boilerplate code. But what I don't understand is what's wrong with some code generation in those circumstances, especially for things that are currently handled with 'design' patterns. I mean, with some amount of 'boring' boilerplate code we can achieve things like sophisticated multiple dispatch (just an example) in C, as sketched below. Of course, such code will be hell to maintain; but why does it *have* to be maintained by a human being? What would be wrong with creating a programming environment, or writing an extension to an existing one, that generates and maintains this sort of code automatically?
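Here is the kind of boilerplate I mean: double dispatch over the runtime type tags of two arguments, done with a plain lookup table in C. It's a toy collision example, and every name in it is invented:

#include <stdio.h>

typedef enum { SHAPE_CIRCLE, SHAPE_BOX, SHAPE_COUNT } shape_kind;
typedef struct { shape_kind kind; } shape;

/* One handler per (kind, kind) pair -- exactly the sort of table a
   code generator could grow and maintain as new shapes are added. */
static void circle_circle(shape *a, shape *b) { (void)a; (void)b; puts("circle vs circle"); }
static void circle_box(shape *a, shape *b)    { (void)a; (void)b; puts("circle vs box"); }
static void box_box(shape *a, shape *b)       { (void)a; (void)b; puts("box vs box"); }

typedef void (*collide_fn)(shape *, shape *);

static collide_fn collide_table[SHAPE_COUNT][SHAPE_COUNT] = {
    [SHAPE_CIRCLE][SHAPE_CIRCLE] = circle_circle,
    [SHAPE_CIRCLE][SHAPE_BOX]    = circle_box,
    [SHAPE_BOX][SHAPE_CIRCLE]    = circle_box,   /* symmetric pair shares a handler */
    [SHAPE_BOX][SHAPE_BOX]       = box_box,
};

/* Dispatch on the runtime kind of *both* arguments at once. */
void collide(shape *a, shape *b) {
    collide_table[a->kind][b->kind](a, b);
}

Tedious to write by hand once you have a dozen shapes, which is exactly my point: it's mechanical enough that a tool could write and maintain it for us.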

I keep seeing that we're improvising, trying to solve problems that already have well-defined, mathematical solutions. Theoretically, we could use C to write the "meaty" stuff of our program, the routines that deal with the domain our application is targeted at, and have our environment churn out in the background the C code for things like memory management, message passing, and dividing work into threads, with us designing the whole thing using GUI tools. That auto-generated code might be readable by humans, or it might not; but really, who cares, when a script deals with it anyway? The argument, which I've used myself in the past, that in that case you won't really be using C anyway so you might as well move to another language, makes less and less sense to me nowadays; you *do* use C where it counts. Do we really care if some repetitive code, which is only there to set up the structure for the actual code to run in, is invisible to us and controlled through a friendlier UI?
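Actually, a crude version of this already exists inside C itself: the X-macro trick, where one "spec" drives all of the repetitive code. It's the same idea as a GUI tool generating it, just uglier; every name below is made up for the example:

#include <stdio.h>

/* The single "spec": every field of the entity, listed exactly once. */
#define ENTITY_FIELDS(X) \
    X(int,   hp)         \
    X(int,   armor)      \
    X(float, speed)

/* Generate the struct definition from the spec. */
#define AS_MEMBER(type, name) type name;
typedef struct { ENTITY_FIELDS(AS_MEMBER) } entity;

/* Generate a debug printer from the same spec: add a field to the
   list above and both pieces of "boilerplate" update themselves. */
#define AS_PRINT(type, name) printf("  " #name " = %g\n", (double)e->name);
void entity_dump(const entity *e) {
    ENTITY_FIELDS(AS_PRINT)
}

int main(void) {
    entity e = { 100, 5, 2.5f };
    entity_dump(&e);
    return 0;
}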

Anyway, those are my thoughts at the moment... just thought I'd share.
Quote:
Original post by ranakor
What's the building in that first photo?
Hotel in Singapore. It's been showing up in the Formula 1 skyline shots a lot right now.

Capn:

I can't imagine that real-world development in Ruby/Python is more than a single-digit percentage of jobs out there. It's very 'cool' but I wouldn't want to stake my career on it. What do you do that everyone is using these?

Your other real options are C++, Java or .NET. The former is dead unless you're doing something specialist like games or whatever. Java is great for the open-source community but you will have to put up with people constantly professing amazement that you use Windows, and the language is a little unexciting. .NET has the MS stigma, but if you don't care what other people think, C# is a very exciting language - all the new functional stuff, and the ability to also work in F#, is pretty neat.
Quote:
Original post by d000hg
I can't imagine that real-world development in Ruby/Python is more than a single-digit percentage of jobs out there.
Depends on the job sector, and what you define as 'real-world'. If you browse some of the rent-a-coder sites, there are a ridiculous number of people trying to build the next Facebook using Ruby on Rails...

Web development is a *huge* sector of programming jobs, and the focus seems to have shifted (at least with the newer/smaller companies) away from ASP and PHP, and towards Python and Ruby.

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

Quote:
Original post by d000hg
Capn:

I can't imagine that real-world development in Ruby/Python is more than a single-digit percentage of jobs out there. It's very 'cool' but I wouldn't want to stake my career on it. What do you do that everyone is using these?

Your other real options are C++, Java or .NET. The former is dead unless you're doing something specialist like games or whatever. Java is great for the open-source community but you will have to put up with people constantly professing amazement that you use Windows, and the language is a little unexciting. .NET has the MS stigma, but if you don't care what other people think, C# is a very exciting language - all the new functional stuff, and the ability to also work in F#, is pretty neat.


I wouldn't call it dead, as there are still plenty of C++ jobs in demand even outside gaming or whatever your definition of "whatever" is. I don't see C++ ever dying. .NET is overrated: while it's pretty fast to create applications in, it's limited when it comes to cross-OS GUIs. Gtk# is awful and I wouldn't code in it, and beyond that there's nothing, since WPF is Windows-only. Yes, I know a lot of people on this site don't care about Linux or Mac, but some do, since their job means working on all three OSs; otherwise I'd have jumped on the bandwagon already. I still prefer coding in C++, but everyone has their own opinion. I do web development in Python using Django and I'm quite happy with it now that I've figured it out. At least it's better than PHP.
In the web-dev startup world it's all about Rails, with a smaller portion about Django. I've been trying to find a technical co-founder for an app for a while, and it's been Rails or go home for most of the quality people I've run into.

I really like Silverlight and C# as a combination, but it's hard to find good hackers who are into C# and aren't beaten down by a full-time production job.
"Let Us Now Try Liberty"-- Frederick Bastiat
Quote:

I recently ran across a blog posting that echoed a lot of personal feelings that I hadn't yet figured out how to articulate.

It's a very hard decision. I'm very productive in .NET with C#. But I do often feel like it's me against the world here. Certainly none of my coworkers are any help; I am the one that helps them, not the other way around. All of the people I like talking to about programming in real life are doing Python or Ruby (or PHP, but that's not a reasonable consideration). I'm the odd duck in the crowd when I show up to developer meet-and-greets in Philly.

It seems like everyone else gets to work on these fun projects and I'm stuck re-writing the same damn SQL-based report viewer sites. Is it true? Or am I not looking in the right places? I wager there is a LOT more .NET dev going on than Python, certainly than Ruby (and I don't mean to focus on these two languages, they are just easy examples). Could it be that there are equal amounts of good work in both, but the .NET market also has a thick layer of bullshit on top?


I think the root of the problem is that .NET is viewed as an "Enterprise" solution, and once you get that label, it's really hard to shake. Middle managers love "enterprise" solutions, but to hackers it's a death sentence, and they stay far away.

In my experience, there are really two types of programmers: the ones who do it because they're just really passionate about programming, and the ones who do it because it's a steady job and they vaguely like computers. The second type, I think, constitutes the vast majority, and they tend to gravitate towards whatever they think has the best chance of giving them consistent employment. These aren't the kind of people that go to developer meet-and-greets :-)

Both Java and .NET are very "enterprise" oriented, and as such, people tend to learn them in order to get a job. The problem is, programming is really, really hard even if you're good at it, and it's also really hard to even know *if* you're good at it; the 9-to-5ers tend not to be very good, and tend not to know it, because you need to practice a lot in order to be good. And I think that poisons the community a little bit, because talented developers look at, say, the Java community, see a lot of schmucks writing FactoryFactories, and decide to stay far, far away.

Of course, the sad thing about all this is that .NET is actually a pretty solid platform, and Microsoft has done a lot to pull the most useful aspects of functional programming into it and bring them to more developers in a way that's not alien and confusing, so it's not stunted like Java is. But it's got the whole enterprise stink on it, so I doubt it's ever going to appeal to the talented-hacker crowd (unless maybe Mono takes off).

