
A Daft Statement

Started by May 26, 2013 08:26 AM
36 comments, last by LorenzoGatti 11 years, 7 months ago

What I failed to communicate was that I think your opinions are not based on first hand experience but on your readings.

Yes, my readings. Here's an example of my readings:

http://vlsi.cs.ucf.edu/nsf/files/GeneralWaveletProductIntegral_4.7.06.pdf

That is the one which caused me to start this topic. Can you find why? I don't think you can.

In fact, I haven't read much else besides papers like this. These days when I'm reading, it's usually a news article, content from this website (gamedev.net), or a few things from Wikipedia, but mostly I read academic papers related to computer science and mathematics. That's about it, honestly. I heard about Lee Smolin very long ago, and when I first mentioned him I even cringed, because I knew someone was going to bring it up. That's partly why I'm annoyed.

My point in a nutshell:

Practical mathematics operates in the interest of pure mathematics, and you take the formalizations you've been given much less creatively than you claim. You really do. Which features of Roman numerals are better than those of Arabic numerals? Is it best for computers to use binary, and does mathematics really support such reasons? Of course you can reason with mathematics to support the validity of your reasons, but the math is likely second-hand reasoning. I feel like everyone picks up all the arbitrary and normative reasons handed to them and perceives them as inherent features of mathematics. You like to claim that's not true... whatever. This also occurs in... everything else. It has a lot of stupid side effects, but it's just how humans build things. Human nature, though awesome, can often be annoying.

Throughout my life I have met many very smart teenagers who had strong opinions about a lot of things. Generally these opinions are born out of not being aware of what they don't know, or perhaps they are just a form of self-expression, where they feel the need to put forth strange theses to proclaim to the world how different and ground-breaking their thought is.

Although I rarely read books (my wife says I am illiterate in three languages), I did read a couple of books by Nietzsche many years ago and I got the same feeling as when talking to one of these teenagers. Well, perhaps Nietzsche was more interesting than most of them.

In general I find their opinions (including Nietzsche's) somewhat childish and ultimately not very interesting.

I can't decide whether Reflexus is one of these teenagers or just a troll. In either case, *yawn*...


I can't decide whether Reflexus is one of these teenagers or just a troll. In either case, *yawn*...


Edit:

I just looked up "Nietzsche"

I hope I'm not as ridiculous as he was. I think you're suggesting that I am. Fine :/

where they feel the need to put forth strange theses to proclaim to the world how different and ground-breaking their thought is.

lol. My thesis is: indicable.

In general I find their opinions (including Nietzsche's) somewhat childish and ultimately not very interesting.

I know exactly what you mean. I called this "A Daft Statement" for a reason.

I'm a mathematician and I have actually produced something worth publishing. I'm currently working as a graphics programmer, so in some sense I am also a computer scientist. I have no problem understanding the paper you posted, but I don't see how it relates to your thesis. Since you can't really provide any argument for it and I'm annoyed, I don't think I will reply anymore. Bye.

I did not get the point of this thread and the supposedly daft statement.

What are you trying to tell us?

Maybe you should give an example or something.

You really do. Which features of Roman numerals are better than those of Arabic numerals? Is it best for computers to use binary, and does mathematics really support such reasons? Of course you can reason with mathematics to support the validity of your reasons, but the math is likely second-hand reasoning. I feel like everyone picks up all the arbitrary and normative reasons handed to them and perceives them as inherent features of mathematics. You like to claim that's not true... whatever. This also occurs in... everything else. It has a lot of stupid side effects, but it's just how humans build things. Human nature, though awesome, can often be annoying.

Setting aside the vaguely mathematical angst about human nature, I know a couple of things about computer implementations of number systems, and I can tell you are confusing a number of different levels:

  • Formally right or wrong. All number systems "work" in the sense that they can represent numbers and be used for computation, but some are better than others.
  • Special good and bad properties. Properties are formal, but what makes them a feature or a problem are technological factors. For example, number systems with only 2 symbols are a good match for electronic circuits, but not for many types of mechanical calculator. Inventing different representations using 2 symbols is an interesting mathematical challenge driven from concrete applications; nobody thinks "inherent features of mathematics" are involved.
  • Fitness for a purpose. Nobody in their right mind thinks there is a "best" number system; it depends on what has to be computed and on a number of budgets and constraints; all applications require trading off representation size and/or electronic circuit size against computation speed (e.g. avoiding carry propagation in addition by storing 2 bits per digit) or initial and final conversion effort against computation effort (e.g. enduring BCD arithmetic because for few calculations it might cost less than converting to and from binary). Normal people don't have any particular "arbitrary and normative reasons" affecting these engineering choices; using bad technology (e.g. most arithmetic with Roman numerals) is usually the result of honest ignorance, not of epistemological biases.
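The carry-propagation tradeoff in the second and third points can be made concrete. Below is a minimal sketch of carry-save addition in Python (the function name is mine, not from any library): three numbers are reduced to two with purely per-bit operations, so no carry ripples between positions; a single ordinary add at the end resolves the deferred carries.

```python
def carry_save_add(a, b, c):
    """Return (s, k) with a + b + c == s + k, computed without carry propagation."""
    s = a ^ b ^ c                     # per-bit sum, ignoring carries
    k = (a & b) | (a & c) | (b & c)   # per-bit carry out of each position
    return s, k << 1                  # carries belong one position higher

a, b, c = 13, 7, 22
s, k = carry_save_add(a, b, c)
print(s, k, s + k)  # 28 14 42
assert s + k == a + b + c
```

This is why hardware multipliers sum many partial products in carry-save form and only propagate carries once at the end: each reduction step takes constant depth regardless of word size.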

Omae Wa Mou Shindeiru


Yes, my readings. Here's an example of my readings:

http://vlsi.cs.ucf.edu/nsf/files/GeneralWaveletProductIntegral_4.7.06.pdf

That is the one which caused me to start this topic. Can you find why? I don't think you can.

I'm guessing you found some of their "variable names" offensive...

If you can solve the last two topics I created, I will consider you intelligent.

"Computer scientists must usually avoid thinking about problems in formal mathematics"

Show us how you got to that conclusion and you will find that will be a better thing to write about.

Of course, I frequently use mathematics when developing software. But I strictly separate this from computer science.

  • Formally right or wrong. All number systems "work" in the sense that they can represent numbers and be used for computation, but some are better than others.
  • Special good and bad properties. Properties are formal, but what makes them a feature or a problem are technological factors. For example, number systems with only 2 symbols are a good match for electronic circuits, but not for many types of mechanical calculator. Inventing different representations using 2 symbols is an interesting mathematical challenge driven from concrete applications; nobody thinks "inherent features of mathematics" are involved.
  • Fitness for a purpose. Nobody in their right mind thinks there is a "best" number system; it depends on what has to be computed and on a number of budgets and constraints; all applications require trading off representation size and/or electronic circuit size against computation speed (e.g. avoiding carry propagation in addition by storing 2 bits per digit) or initial and final conversion effort against computation effort (e.g. enduring BCD arithmetic because for few calculations it might cost less than converting to and from binary). Normal people don't have any particular "arbitrary and normative reasons" affecting these engineering choices; using bad technology (e.g. most arithmetic with Roman numerals) is usually the result of honest ignorance, not of epistemological biases.

Oh, we're getting somewhere. Yes! Also consider how such premises and elements influence the direction of practice, development, and method. For example, working with 10 symbols might, surprisingly, isolate you from thinking about benefits which are not exclusive to 2-symbol representations but are more obvious and more familiarly practiced in them. We've thought of some awesome things developed for binary systems, e.g. two's-complement representation. These solutions were designed to emulate operations which don't necessarily depend on how many symbols there are, yet I've not encountered much work furthering that potential. For example, matrices are a cool device pioneered for many purposes that already had special methods but can now be elegantly generalized to a single form of representation: a matrix. The same applies to tensors. I hope mathematicians continue to innovate with representation in this way, but I've seen a lot of activity to the contrary in practical mathematics, particularly in computer science. Don't forget to consider departing from a generalization to explore different sets of assumptions and vastly more appropriate representations, etc.
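For what it's worth, the claim that two's complement doesn't depend on the number of symbols checks out: it is the base-2 case of the general radix complement. A minimal sketch (function names and the half-range sign convention are my own illustration, assuming an even base):

```python
def encode(x, base, ndigits):
    """Represent x as a nonnegative radix-complement value modulo base**ndigits."""
    return x % (base ** ndigits)

def decode(r, base, ndigits):
    """Map the upper half of the range back to negative values."""
    m = base ** ndigits
    return r - m if r >= m // 2 else r

# Subtraction becomes addition in any base, exactly as in two's complement.
# Example in base 10 with 3 digits (ten's complement): 100 - 42 = 58.
m = 10 ** 3
r = (encode(100, 10, 3) + encode(-42, 10, 3)) % m
print(decode(r, 10, 3))  # 58
```

With base=2 and ndigits=8 this reproduces ordinary 8-bit two's complement, so the mechanism really is symbol-count-agnostic.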

I only disagree with your last statement, but we probably perceive epistemological bias slightly differently from each other. I'd be more comfortable having a discussion about epistemological bias.

p.s. excuse my convoluted writing style. I'm slightly mad (as in, crazy) when I communicate. I can't form smooth sentences without taking the time to shave everything down. xD

using bad technology (e.g. most arithmetic with Roman numerals) is usually the result of honest ignorance

It's straightforward and easy, but tedious and convoluted. Binary arithmetic is tedious but scalable and simple to implement. I honestly don't see what's so bad about Roman numerals. Of course we believe they're ineffective for most modern applications. I think they have merit and embody principles with a lot of potential, but we dismiss them merely for their complexity. Symbolic algebra is very complex and inelegant. Roman numerals use principles which could augment algebras to be much more conservative in their nature of expression and manipulation (i.e. simple and elegant to implement), and of course these principles stand apart from Roman numerals themselves. I originally had this idea when evaluating two's-complement arithmetic and fixed-point arithmetic. The principles are quite congruent with the inherently procedural nature of Roman numerals. You probably think I'm batshit stupid now. Whatever.
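The "procedural" character of Roman numerals can be shown concretely: addition works by pure symbol manipulation, with no conversion to a positional base. This sketch is my own illustration (the table names and rules are mine): expand subtractive pairs, merge and sort the symbols, collapse runs into the next larger symbol (the analogue of carrying), then restore subtractive notation.

```python
SUBTRACTIVE = [("CM", "DCCCC"), ("CD", "CCCC"), ("XC", "LXXXX"),
               ("XL", "XXXX"), ("IX", "VIIII"), ("IV", "IIII")]
REDUCE = [("IIIII", "V"), ("VV", "X"), ("XXXXX", "L"), ("LL", "C"),
          ("CCCCC", "D"), ("DD", "M")]
ORDER = "MDCLXVI"  # symbols in descending weight

def roman_add(a, b):
    # 1) rewrite subtractive pairs additively (IV -> IIII)
    for pair, plain in SUBTRACTIVE:
        a, b = a.replace(pair, plain), b.replace(pair, plain)
    # 2) merge both strings and sort symbols by weight
    s = "".join(sorted(a + b, key=ORDER.index))
    # 3) "carry": collapse runs into the next larger symbol until stable
    changed = True
    while changed:
        changed = False
        for run, sym in REDUCE:
            if run in s:
                s = s.replace(run, sym, 1)
                changed = True
    # 4) restore subtractive notation (IIII -> IV)
    for pair, plain in SUBTRACTIVE:
        s = s.replace(plain, pair)
    return s

print(roman_add("XIV", "VII"))  # XXI
```

Note that no step ever interprets a digit's position; each rule is a local rewrite, which is the "conservative manipulation" being described above.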

I'm guessing you found some of their "variable names," offensive...

Fair guess.

This topic is closed to new replies.
