
Ravi Mohan's Tech Blog

## Thursday, December 25, 2008

### Two levels of mathematics

I've written about the connection between programming and mathematics before. My latest discovery is that there are two "levels" (for lack of a better word) of mathematics - one inhabited by engineers and one by mathematicians. This divide carries over into books as well: taking Calculus as an example, Gilbert Strang's book is an engineer's guide and Spivak's book is for mathematicians. I am working through the latter, and while it is *very* clearly written, it is a whole different level from any book on Calculus I've worked with before.

## Friday, October 3, 2008

### An Unsatisfactory Proof

To be proved

"If the symmetric difference of sets A and B is a subset of A, prove that B is a subset of A"

The symmetric difference of A and B = (A U B) \ (A ^ B) (using U for union, ^ for intersection, and \ for difference).

What I've come up with

The Proof

Let x be an arbitrary element of B.

Then x is also an element of A (case 1) or it is not (case 2).

case 1.

x is an element of A by assumption, so there is nothing more to show for this case.

case 2.

x is an element of B but not an element of A. In that case, x is an element of (A U B) \ (A ^ B), i.e. it is an element of the symmetric difference of the sets A and B. Since it is given that the symmetric difference of A and B is a subset of A, x is an element of A.

Since x is an element of A in both cases and since x is an arbitrary element of B, we conclude that B is a subset of A.

Q.E.D.

I keep getting the feeling there is a more elegant way to prove this. If anyone wants to write to me (or comment here), I'd much appreciate it.

PS: the proposition is obviously true (draw a Venn diagram to see why). I am asking for help with the (structure of the) proof itself.
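In lieu of a Venn diagram, the proposition can also be checked exhaustively over a small universe. Here is a quick sketch (the class and method names are mine, purely illustrative): it enumerates every pair of subsets A, B of {0,...,4} and verifies that whenever (A U B) \ (A ^ B) is a subset of A, B is a subset of A as well.

```java
import java.util.HashSet;
import java.util.Set;

final class SymDiffCheck {

    // (A U B) \ (A ^ B), built from plain java.util.Set operations.
    static Set<Integer> symDiff(Set<Integer> a, Set<Integer> b) {
        Set<Integer> union = new HashSet<>(a);
        union.addAll(b);
        Set<Integer> inter = new HashSet<>(a);
        inter.retainAll(b);
        union.removeAll(inter);
        return union;
    }

    // Decode a bitmask into a subset of {0, ..., n-1}.
    static Set<Integer> subsetFromBits(int bits, int n) {
        Set<Integer> s = new HashSet<>();
        for (int i = 0; i < n; i++)
            if ((bits & (1 << i)) != 0) s.add(i);
        return s;
    }

    public static void main(String[] args) {
        int n = 5; // 2^5 = 32 subsets each for A and B: 1024 pairs
        for (int a = 0; a < (1 << n); a++) {
            for (int b = 0; b < (1 << n); b++) {
                Set<Integer> A = subsetFromBits(a, n);
                Set<Integer> B = subsetFromBits(b, n);
                // If the hypothesis holds but the conclusion fails, report it.
                if (A.containsAll(symDiff(A, B)) && !A.containsAll(B)) {
                    System.out.println("counterexample: A=" + A + " B=" + B);
                    return;
                }
            }
        }
        System.out.println("no counterexamples");
    }
}
```

Of course, checking 1024 pairs proves nothing in general; it is only a sanity check on the statement, not a substitute for the proof.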


## Friday, September 5, 2008

### The key to understanding proofs

is to realize that the exposition of a proof is distinct from its derivation. The (published) exposition is a very condensed description of the what and the why. The (usually unpublished) derivation is about the how - how the proof was derived. Reading carefully between the lines of the exposition, one can often discern a faint outline of the creator's thought process, but that's about it.

A (weak) analogy is having the binary code of a program that does something wonderful. Without the source code (the step-by-step derivation of the proof), one has to use disassemblers (digging into the exposition) to figure out how it works. Having the source code would be better in some contexts, but most proofs are "binary only".

Having said that, nothing stops me from starting from the givens and trying to prove the conclusion myself, which is a worthwhile exercise in itself. ("worthwhile" as in "what does not kill me makes me stronger" ;-) )


## Saturday, July 12, 2008

### The JVM and Tail Recursion

In an attempt to investigate the performance of a trampolined interpreter on the JVM, I discovered something interesting.

First, some background. In essence, trampolining is a strategy used to avoid stack overflow on platforms (like the JVM) that push stack frames even when it is not necessary, e.g. while making a tail-recursive function call. So after doing a CPS transform and then trampolining it by returning a lambda-wrapped "thunk", my interpreter *should have* handled an infinite tail-recursive call *without* a stack overflow exception. But it failed anyway!

After Dan Friedman pointed out (by email) that "If you use exceptions in Java, you will regain tail recursion. I think that is the only way to get around Java", I did some digging and found something even more interesting.

Tail-recursive (Java) functions are properly handled by some JVMs (specifically, by some JITs) and not by others! See this article for a good explanation.

Try running Listings 4 and 5 (from the referenced article) on your JVM :-D. I suspect that C# code equivalent to those two listings would both run perfectly. Can someone with a Windows machine and .NET installed let me know if that's the case? (Post a comment here or send me email. Thanks in advance!)

Footnote: the conversion of the tail-recursive function (in the article) into a while loop is one example of how a trampolining interpreter can "break stack build-up".
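To make the footnote concrete, here is a minimal sketch of the trampolining idea in plain Java (the names `Step`, `done`, `more`, and `run` are my own, not taken from the article or my interpreter). Each "recursive" step returns either a final value or a lambda-wrapped thunk describing the next step; a while-loop driver bounces between them, so the Java call stack never grows.

```java
import java.util.function.Supplier;

final class Trampoline {

    // A computation step: either a finished value or a thunk for the next step.
    interface Step<T> {
        boolean done();
        T value();       // valid only when done() is true
        Step<T> next();  // valid only when done() is false
    }

    static <T> Step<T> done(T v) {
        return new Step<T>() {
            public boolean done() { return true; }
            public T value() { return v; }
            public Step<T> next() { throw new IllegalStateException(); }
        };
    }

    static <T> Step<T> more(Supplier<Step<T>> thunk) {
        return new Step<T>() {
            public boolean done() { return false; }
            public T value() { throw new IllegalStateException(); }
            public Step<T> next() { return thunk.get(); }
        };
    }

    // The driver: a plain while loop, so stack depth stays constant
    // no matter how many "recursive" steps are taken.
    static <T> T run(Step<T> step) {
        while (!step.done()) step = step.next();
        return step.value();
    }

    // A tail-recursive counter, rewritten in trampolined style:
    // instead of calling itself, it returns a thunk for the next call.
    static Step<Long> count(long n, long acc) {
        if (n == 0) return done(acc);
        return more(() -> count(n - 1, acc + 1));
    }

    public static void main(String[] args) {
        // Ten million "recursive" calls, no StackOverflowError.
        System.out.println(run(count(10_000_000L, 0L)));
    }
}
```

Written as a direct recursion, `count` would blow the stack long before ten million calls on a JVM that doesn't eliminate tail calls; the thunk-plus-driver structure is exactly the "while loop form" the article's conversion produces.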


## Wednesday, July 2, 2008

### Directions

This is the first post on my brand new blog. The essential difference between this blog and the old one is a change of "voice" and focus.

Unlike the "catch all" nature of the old blog, this one will have a sharper focus, specifically on things technical (mostly code, mathematics, algorithms, etc. - but "technical" is used very broadly; I could very well end up dissecting the Black-Scholes equation for pricing derivatives, for example). I will create a separate blog for non-technical ruminations (TBD).

So what's happening on my end? Life is insanely busy (as usual), I am meeting great people and interesting situations (as usual), and the great difficulty (as always) is choosing among the zillion challenges to tackle next. "So little done, so much to do", to quote Cecil Rhodes - but hey, it beats a boring, "normal" life any day.

On the technical/programming front, I am investigating the impact of various (language) interpreter designs on execution time. Over the coming weekend, for example, I am trying to get some sense of how much overhead trampolining adds to interpretation time (vs a normal recursive descent parser).

So that is what I'll write about in the next few entries. And then there is the reinforcement learning library I wrote for the good folks at the DRDO (the Defence Research and Development Organisation, for non-Indian readers). I could probably spin that off into an open-source project once I (find some time to) tidy it up a bit.

Then there's the derivation of concurrent algorithms, the relational-algebra-to-SQL query generator thingy I've been fiddling with off and on for a while now, and thoughts on programming with typed channels and machine learning algorithms as programming primitives. Lots of interesting things to write about.

Stay tuned.

Hello World! It is good to be back.

