I wrote a small script in Perl to calculate the first N terms of the Fibonacci series:
#-- fibo.pl: naive recursive Fibonacci
use strict;
use warnings;

# fibo(n): nth term, with fibo(0) = fibo(1) = 1
sub fibo {
    return 1 if $_[0] < 2;
    return fibo($_[0] - 1) + fibo($_[0] - 2);
}

# print terms 0 through 40
for my $i (0 .. 40) {
    my $f = fibo($i);
    print "$i : $f\n";
}
#-- eof fibo.pl
I calculated the first 40 terms, and it took about 16 minutes on my PC. I wasn't worried about writing a fast algorithm; I was just curious about the performance of a recursive-call algorithm.
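A rough sketch of how much work that recursion does (illustration only, not part of the 16-minute timing): each call to fibo(n) triggers calls to fibo(n-1) and fibo(n-2), so the call count obeys a recurrence much like the series itself.

#-- call-count sketch (illustration only)
use strict;
use warnings;

# calls(n) = calls(n-1) + calls(n-2) + 1, with calls(0) = calls(1) = 1
my @calls = (1, 1);
for my $n (2 .. 40) {
    $calls[$n] = $calls[$n - 1] + $calls[$n - 2] + 1;
}
print "fibo(40) alone makes $calls[40] calls\n";   # 331,160,281 calls
#-- eof call-count sketch

And the outer 0..40 loop repeats a whole call tree for every earlier term as well, so 16 minutes for hundreds of millions of sub calls is not surprising.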
A friend of mine told me that the equivalent Java code ran fast, so I tried it:
// Java Fibonacci class: same naive recursion
class fibo {
    // calc(x): xth term, with calc(0) = calc(1) = 1
    public static long calc(long x) {
        if (x < 2) return 1;
        return calc(x - 1) + calc(x - 2);
    }

    public static void main(String[] args) {
        for (int i = 0; i <= 40; i++) {
            System.out.println(i + " : " + calc(i));
        }
    }
}
// EOF Java Fibonacci class
----------------------------
The results (calculating 40 terms):
1) Perl: 16 minutes
2) Java: 16 seconds (!)
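If you want to reproduce the Perl figure, one way to time it is with the core Benchmark module; this is just a sketch, not necessarily how the figures above were obtained:

#-- timing sketch using the core Benchmark module
use strict;
use warnings;
use Benchmark qw(timediff timestr);

sub fibo {
    return 1 if $_[0] < 2;
    return fibo($_[0] - 1) + fibo($_[0] - 2);
}

my $t0 = Benchmark->new;
fibo($_) for 0 .. 40;    # same work as fibo.pl, minus the printing
my $t1 = Benchmark->new;
print "took: ", timestr(timediff($t1, $t0)), "\n";
#-- eof timing sketch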
I'm sure that Java does some kind of optimization in the Virtual Machine.
I hope the Parrot team will be able to make a very fast virtual machine...
Re:Memoize
niceperl on 2008-09-13T13:51:53
I'm not looking for a caching scheme like Memoize. I only want to focus on the time taken by each program (Java, Perl) given similar original code. I think Java does some kind of memoization at a low level, doesn't it?
Re:Memoize
Aristotle on 2008-09-13T15:04:28
No. It just has a fundamentally different execution model.
Re:Memoize
niceperl on 2008-09-13T20:18:34
Please explain what you mean.
Re:Memoize
btilly on 2008-09-13T22:20:47
Java looks for heavily used sections of code, aggressively optimizes them, and rewrites them in machine code. That is the reason for the performance difference.
It's also the classic example (it's in SICP) of how a bad algorithm choice can get really slow really fast.
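To make that concrete, here is the Memoize suggestion from this thread applied to the original script (just a sketch, untimed here), using the core Memoize module:

#-- memoized sketch: fibo.pl plus the core Memoize module (untimed)
use strict;
use warnings;
use Memoize;

sub fibo {
    return 1 if $_[0] < 2;
    return fibo($_[0] - 1) + fibo($_[0] - 2);
}
memoize('fibo');    # cache results so each fibo(n) is computed only once

for my $i (0 .. 40) {
    print "$i : ", fibo($i), "\n";
}
#-- eof memoized sketch

The cache collapses the exponential call tree into one computation per term; with a decent algorithm, neither language needs 16 minutes.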