
A Daft Statement

Started by May 26, 2013 08:26 AM
36 comments, last by LorenzoGatti 11 years, 7 months ago

So daft, yet with my experience, I really believe it: Computer scientists must avoid solving problems in formal mathematics.

There you have it. For now, it's probably futile for me to attempt providing any argument in support of this statement. So I'll just start with one issue: Are you able, in any way, to find concurrence between your own views and this statement? Feel free to augment the statement to something which conforms slightly more to your belief. Otherwise, I guess we're just at odds.


That's the topic. Discuss.


I will make a more particular augmentation for myself:

Computer scientists must usually avoid thinking about problems in formal mathematics (excluding lambda calculus) and strongly avoid solving problems in formal mathematics.

Edit:
One very important detail is that this exclusively applies to problems, not existing theory or solutions.

You seem to think there is an inherent difference between computer scientists and mathematicians. I think I am both, so I can't possibly understand your statement.


You seem to think there is an inherent difference between computer scientists and mathematicians. I think I am both, so I can't possibly understand your statement.

Thanks for your feedback. I'm sure you understand it is a fallacy to argue that because you are both, my statement must be false; fortunately, you did not argue that. I'm grateful for your contradiction. I believe there are many problems in computer science that many parts of mathematics gravely fumble with. I suspect your intuition is to deny this.

How can you ever hope to accomplish anything without using formal mathematics? Even your programming language's syntax is formal mathematics (thus the name "formal language"). So by writing a program you do actually use formal mathematics.

Also, without formally finishing your derivations, you will write unnecessarily complex code, which will often lead to slower code.

On the other hand, there seems to be no sensible reason not to use mathematics. Or can you name even a single one to back up your statement?

I guess you cannot implement certain systems that heavily depend on the underlying hardware's features and traits using formal mathematics (not counting the programming language as one here).

But the "non-optimized"/brute-force algorithm you are implementing in the above, or a simple mostly-mathematical algorithm, will be more cumbersome to get working by assembling the system in your head and throwing in a couple of operators until it works than it would be if you used proper math to derive the formulas.

Personally, I see it as follows:

1. Obtain the formula (through knowledge, Google, or deriving it yourself)

2. Transform the formula into a non-brute-force form by applying assorted optimizations and a couple of tree-based operations here and there

3. Implement the optimized formula
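As a toy illustration of the three steps above (my own example, not one from this thread): summing the first n squares. The brute-force form comes straight from the definition; the closed form is what step 2's transformation produces, and step 3 is just typing it in.

```python
def sum_squares_brute(n):
    # Step 1: the formula as obtained, in its straightforward
    # "brute force" form: literally add up k^2 for k = 1..n.
    return sum(k * k for k in range(1, n + 1))

def sum_squares_closed(n):
    # Steps 2-3: the same formula after transformation into a
    # closed form, n(n+1)(2n+1)/6, implemented with integer math.
    return n * (n + 1) * (2 * n + 1) // 6
```

The closed form does constant work where the brute-force version does n additions, which is exactly the kind of payoff the list above is describing.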

The only reasons we cannot or should not use "formal" mathematics from start to end are:

* Simpler algorithms are sometimes easier to figure out without a language at all.

* Mathematics stops being useful when you want to move from the brute-force version to the optimized one (a significant part of the program can be about the optimizations, especially the large-scale ones where you save precalculated values, handle their storage, etc.).

E.g., what formal mathematics gives us is a brute-force algorithm for figuring out the physically correct color of a pixel. One then needs to modify this formula by cutting out insignificant parts, approximating others, parallelizing it, taking temporal/spatial coherence into account, etc.
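The "approximating others" step can be made concrete with a standard graphics example (my choice of illustration, not the poster's): Schlick's approximation replaces the full Fresnel reflectance equations for a dielectric with a cheap polynomial, trading a little accuracy for a lot of speed.

```python
import math

def fresnel_exact(cos_i, n1=1.0, n2=1.5):
    # Full Fresnel reflectance for unpolarized light at a dielectric
    # interface: average of the s- and p-polarized reflectances.
    sin_t = (n1 / n2) * math.sqrt(max(0.0, 1.0 - cos_i * cos_i))
    if sin_t >= 1.0:
        return 1.0  # total internal reflection
    cos_t = math.sqrt(1.0 - sin_t * sin_t)
    rs = ((n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)) ** 2
    rp = ((n1 * cos_t - n2 * cos_i) / (n1 * cos_t + n2 * cos_i)) ** 2
    return 0.5 * (rs + rp)

def fresnel_schlick(cos_i, n1=1.0, n2=1.5):
    # Schlick's approximation: F ~= F0 + (1 - F0) * (1 - cos_i)^5,
    # where F0 is the reflectance at normal incidence.
    f0 = ((n1 - n2) / (n1 + n2)) ** 2
    return f0 + (1.0 - f0) * (1.0 - cos_i) ** 5
```

Both agree exactly at normal incidence and stay close over typical viewing angles, which is why the approximation is acceptable in practice; deciding that this error is "insignificant" is the subjective, hardware-aware judgment the posts below talk about.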

The latter part is usually dependent on the hardware, and things like approximation are often subjective, so it doesn't fit into formal mathematics as well as the first part does. I am not saying it couldn't be part of formal mathematics, however; maybe we just lack the language for that stuff, and even if we had such a language, it would be very specific and thus probably not widely used outside of professional/academic circles.

o3o

Did you make this thread because of my attempt to apply calculus to my economics problem?

Only formal mathematics exists. Informal mathematics is just wrong and useless.

Maybe you are actually contrasting CS "practical" mathematics (readily applicable to writing software) and pure "impractical" mathematics (useful only for pure knowledge or further research), but even this dichotomy is quite meaningless.

Omae Wa Mou Shindeiru

Only formal mathematics exists. Informal mathematics is just wrong and useless.

Is informal mathematics wrong and useless merely because it has not been formalized? If you assert so, then mathematics as a field of thought must be static: its development would always follow a rigid and eternally consistent process, yet that process itself would be unchangeable and inherent to the meaning of "mathematics." In that case, you cannot allow yourself to believe that its incompleteness promises to consistently develop all possible solutions when targeted for practice by an interpretation. Continue by reading: https://en.wikipedia.org/wiki/Completeness#Logical_completeness

Please consider studying Lee Smolin's view on mathematics. He explains a philosophy which is very much concurrent with mine. Of course, I do have my own fanatical quirks apart from his.

Maybe you are actually contrasting CS "practical" mathematics (readily applicable to writing software) and pure "impractical" mathematics (useful only for pure knowledge or further research), but even this dichotomy is quite meaningless.

No. I'm not.

How can you ever hope to accomplish anything without using formal mathematics? Even your programming language's syntax is formal mathematics (thus the name "formal language"). So by writing a program you do actually use formal mathematics.

Programming languages are not fundamental to computer science. If my simple statement is true -- which I'm not requiring you to accept -- then your prompts are not effective. Don't expect an appropriate discussion on this issue until later. My argument here doesn't have enough context yet, so we'll need to wait for the discussion to develop in other areas.

On the other hand, there seems to be no sensible reason not to use mathematics. Or can you name even a single one to back up your statement?

The burden is on you. You even have a contender.

Also, without formally finishing your derivations, you will write unnecessarily complex code, which will often lead to slower code.


This. I want to write a second post to fully respond to this. I'm not really in gear to make an adequate response immediately, so I'll try to discuss this in detail soon.

Continuing

The burden is on you. You even have a contender.


The contender is an indefinite space of methods. It may contain anything from a monkey adding symbols of a formal grammar to a statement at random (given an infinite duration of time to compose such statements, and the assurance that the most optimal solution will eventually be produced) to a mind of ultimate intelligence which may produce the solution in an instant. The former method is known as the infinite monkey theorem.

After learning what it's all about, read this section:
Random Document Generation

Due to processing power limitations, the program uses a probabilistic model (by using a random number generator or RNG) instead of actually generating random text and comparing it to Shakespeare. When the simulator "detects a match" (that is, the RNG generates a certain value or a value within a certain range), the simulator simulates the match by generating matched text.
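The shortcut that quote describes can be sketched as follows. This is a hypothetical reconstruction of the idea, not the actual simulator's code: each "attempt" is represented by a single RNG draw rather than a generated document, since a random string of length L over an alphabet of size k matches the target with probability (1/k)^L.

```python
import random

def simulate_monkeys(target, alphabet_size, seed=None):
    """Probabilistic stand-in for generating and comparing random text.

    Illustrative interface (name and parameters are my own). Returns
    (attempts, matched_text).
    """
    rng = random.Random(seed)
    # Probability that one random document of len(target) symbols
    # happens to equal the target exactly.
    p_match = (1.0 / alphabet_size) ** len(target)
    attempts = 0
    while True:
        attempts += 1
        # One RNG draw replaces an entire random document: a value
        # below p_match means "this attempt matched the target".
        if rng.random() < p_match:
            # Simulate the detected match by generating the matched
            # text directly instead of having typed it at random.
            return attempts, target
```

In use, keep the target tiny (e.g. two symbols over a four-letter alphabet) so the loop terminates quickly; for a Shakespeare-length target the match probability is so small that even this shortcut would need an astronomical number of draws, which is the theorem's whole point.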


This illustrates the commonplace issue of merely identifying "optimal solutions." Declarativeness has become an issue here. Consider the fact that imperative languages have a better record for understandability than declarative ones.

... I need to go. I'll just post this here and continue what I was saying as soon as possible. Soon I will complete the incomplete answers I started in the previous post, and I will also address other comments which I have not yet responded to. Bye.

Heaven forbid computer scientists from solving problems involving set theory. Computer scientists should never mess with sets.

Out of curiosity, how does someone who only just turned 18 years old come to such a "daft" conclusion?


This topic is closed to new replies.
