
A Daft Statement

Started by May 26, 2013 08:26 AM
36 comments, last by LorenzoGatti 11 years, 7 months ago

Two's-complement-style operations don't require a base-2 representation. There is ten's complement in base 10... it's just a way to represent negative numbers in an unsigned number system... think about what happens when you wind a mileometer back below 0... it goes to 99999 (or however many digits it has)... so 99999 is the (5-digit) representation of -1 in ten's complement.
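The mileometer analogy above can be sketched directly. This is a minimal illustration, not from the original post; the 5-digit width and the function names are chosen to match the example, and the halfway point plays the role of the sign bit.

```python
# Ten's complement: represent negative numbers in an unsigned,
# fixed-width decimal system, just like two's complement in binary.

DIGITS = 5
MODULUS = 10 ** DIGITS  # 100000

def to_tens_complement(n):
    """Encode a (possibly negative) integer as an unsigned 5-digit value."""
    return n % MODULUS

def from_tens_complement(u):
    """Decode: values >= 50000 are negative, like a set sign bit in binary."""
    return u - MODULUS if u >= MODULUS // 2 else u

print(to_tens_complement(-1))       # 99999, the mileometer wound back below 0
print(from_tens_complement(99999))  # -1
```

Winding back one more step gives 99998 for -2, and so on, exactly as the odometer behaves.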

"Most people think, great God will come from the sky, take away everything, and make everybody feel high" - Bob Marley

Two's-complement-style operations don't require a base-2 representation. There is ten's complement in base 10... it's just a way to represent negative numbers in an unsigned number system... think about what happens when you wind a mileometer back below 0... it goes to 99999 (or however many digits it has)... so 99999 is the (5-digit) representation of -1 in ten's complement.

That was my point... Yeah, I know. Sometimes I'm a terribly ineffective writer.

benefits which are not exclusive to working with 2 symbols, but which are more obvious and more familiarly practiced in 2-symbol representations. We've thought of some awesome things which have been developed for binary systems, e.g. two's-complement representation. These solutions were designed to emulate operations that don't necessarily depend on how many symbols there are.

I should have used the word 'techniques' instead of "benefits".
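The claim that these techniques aren't tied to two symbols can be checked concretely: the radix complement of x with k digits in base b is b^k - x, and adding it (mod b^k) performs subtraction in any base. A small sketch; the function names and the particular bases/widths are illustrative choices, not from the thread.

```python
# Radix complement works in any base b, not just base 2:
# complement(x) = b**k - x, and a + complement(b_val) == a - b_val (mod b**k).

def radix_complement(x, base, width):
    return (base ** width - x) % (base ** width)

def subtract_via_complement(a, b, base, width):
    # a - b computed as a + complement(b), discarding the carry out of the top digit
    return (a + radix_complement(b, base, width)) % (base ** width)

# Same subtraction, emulated identically in base 10 and base 2:
print(subtract_via_complement(42, 17, base=10, width=5))  # 25
print(subtract_via_complement(42, 17, base=2, width=16))  # 25
```

When the result would be negative, it wraps to the complement representation, just as the mileometer example describes.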


I'm going to move this thread to the lounge. The discussion is somewhat interesting, philosophically, and there's been some meaningful interaction. But the thread is not in fact about math or physics, or game development. Therefore, it doesn't belong in the math and physics forum.

grhodes_at_work

Graham Rhodes Moderator, Math & Physics forum @ gamedev.net

Thank you.

'nuff said.

So daft, yet with my experience, I really believe it: Computer scientists must avoid solving problems in formal mathematics.

There you have it. For now, it's probably futile for me to attempt any argument in support of this statement. So I'll just start with one question: are you able, in any way, to find concurrence between your own views and this statement? Feel free to amend the statement into something that conforms more closely to your own belief. Otherwise, I guess we're just at odds.

For future reference, writing like this is not conducive to any sort of formal debate outside of maybe philosophy. Just say, "Can anyone think of a reason why 'computer scientists must avoid solving problems in formal mathematics' is false?" It is fine to speak and write like a normal person when you are trying to debate.

Likewise, starting a debate by asking other people to prove you wrong is also flawed, and avoiding their questions by answering them with questions isn't conducive to finding answers. If your goal is to find the right questions, you are debating philosophy. If your goal is to find answers, you shouldn't make such an effort to convolute your point.

edit: Sorry if you find this hurtful, but can you honestly read your first couple posts and understand what you were trying to say or would you like to have that kind of conversation with yourself?


I'm not saying you can't apply mathematics to computer science, but it's a fault to pretentiously wield mathematics as if computer science were merely mathematics in a slightly altered form. It's certainly not. Yes, you can pretend there's an abstract entity at the center all you want... but you can't justify that in a formal way at all. I'm not pushing against formal mathematics. I did present a hypothetical dilemma:

Is informalized mathematics wrong and useless simply because it is informal and has not been formalized? If you assert so, then mathematics as a field of thought must be static: its development would always follow a rigid and eternally consistent process, and this process itself would be unchangeable and inherent to the meaning of "mathematics." In that case, you cannot allow yourself to believe that, despite its incompleteness, it promises to consistently develop all possible solutions when put into practice through an interpretation.


I think people may have missed my implicit distinction between "informalized" mathematics (developing mathematics that has not yet received formal consideration within a consensus, but probably will after publication, recognition, etc., even though it has been developed in a considerably formal way) and "informally" presented mathematics (i.e. ineffective). Computer science was a construction of mathematics, but only with regard to the process of formalization. I assert that "formal computer science" itself, as mature and as different from mathematics as it is, cannot continue productively while so anchored in the formal domain of mathematics. It is no longer sensible to consider computer science an appendage of mathematics. I think it is a distraction. Of course you may borrow expressions built with the common formalisms of mathematics, but please consider this borrowing and no more.

My most critical point is also a very subtle one, so it is difficult to debate rigorously. It's a red herring to posit a central, formalizable entity between mathematics and the progress of computer science while considering all of the eccentric formalizations that mathematics has accrued on the way to its modern image. It seems to me that many computer scientists are getting spun around by mathematics in ways they shouldn't.

Regarding the paper I linked to above, of course I had absolutely no problem with the mathematics it presented. I was disturbed by the overall approach the authors decided to take. In the Problem Formalization section, they formalized the rendering aspects of their research very well, using formal mathematics. So what's my problem?

There's minimal problem formalization addressing the computational aspects. They could even have used mathematics for the analysis; I'd have been fairly satisfied with that. But there's no computational context at all! Analysis aside, how are you supposed to formalize the computational aspects of the problem...? Hey, maybe even with lambda calculus? Lambda calculus doesn't go far enough.

So then, lambda calculus should be extended, right? No. A lot of programming languages feature elements of lambda calculus. Go do all the functional programming you want. If you actually happen to develop something significant, your method was probably affected very little by the programming language (regarding the significant computational approach, not the individual implementation), even considering how strongly many languages differ. If the authors of the aforementioned paper actually had used mathematics to formally analyze any computational issues that may have directed them, the mathematics still would have done virtually nothing to help them as computer scientists engaged in developing solutions for fundamentally computational problems. The closest thing to mathematical formalization that computer scientists get to use today, apparently, is pseudo-code or a particular programming language. Aren't programming languages just reducible to mathematical grammars, after all? That says absolutely nothing about the process of computer science. I will also assert that computer science, with its real differences from mathematics, has not been appropriately and sufficiently formalized. Beyond its functional aspects, computer science does behave formally as a science, with benchmarks and concrete analysis.
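The claim that programming languages carry elements of lambda calculus can be made concrete with Church numerals, the classic pure-lambda-calculus encoding of numbers, written here in any language with first-class functions. This is a standard textbook sketch, not something from the thread; the names `zero`, `succ`, `add`, and `to_int` are illustrative.

```python
# Church numerals: a number n is the function that applies f to x, n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting the applications of f."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```

Everything above is plain function application; no arithmetic primitives are used until the final decoding step, which is the sense in which the encoding is "pure" lambda calculus.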

If you reflect on your thoughts carefully, you might notice equivocation problems occurring when you intersect computational problems with mathematical formalization (or similar notions which respect the common image of mathematics and its approach to abstraction). The original point of this topic was to probe for people who might have had experiences similar to mine. Indeed, my original post was very much based on first-hand experience.

We can generalize a Turing tape to a set, and we always have. Conversely, we can build real computational machines. We can even represent models of real machines with mathematics, including temporal and physical constraints, then proceed to analyze the effects of these models. What is missing? A lot of data. Mutability.

The formalism of mathematics is dominated by symbolic expression, but computer science is very much a real science. Most sciences have standard practices and common methods, but the unpredictability (essentially what makes it a science) forces these methods to be too dynamic and complex to express in anything less than natural language, though frequently elaborated with mathematics. I'm talking about method and development, not scientific theory; the realm of theses.

To say the least, you can't solve all of your computational problems by working solely as a mathematician... and the same goes for any other sort of problem, unless you're working like a monkey. Nevertheless, so many "computer scientists" like to contradict me.

You speak of unpleasant first-hand experience with computer science and programming; then you should be able to take the discussion to a much more meaningful level.

  • What specific problems did you have, and what were you trying to do?
  • What did you dislike as excessively mathematical style, or whatever, in the specific computer science theories, papers, advice, etc. you used?
  • What better approaches to those specific theories and programming tasks do you propose, if any?

If you think you have good ideas to share or interesting problems to think about, examples would be much clearer than vague and cryptic rants; you make a point of rejecting the language and concepts everybody else uses, which is a very bad starting point for discussing philosophy in general terms.

Omae Wa Mou Shindeiru

This topic is closed to new replies.
