I seriously doubt you'll ever have to worry about this in "real" code, but some comments last night at a talk about garbage collection got me thinking. The comments were to the effect that, once, some Perl program had lots of data structures and took a "long time" to exit because it first had to clean up all the reference counts.

So I tried to come up with a worst case, and threw this together to make a ref to a ref to a ref, etc. ... and was surprised by the result. It didn't take all that long to exit when $n was 1 million, but when I bumped it to 2 million, I got a core dump: not while building the data structure, but while the program was exiting. Well, yeah, duh, I guess the reference (un)counting is recursive: freeing the outer reference frees the one inside it, and so on down the chain, and presumably two million nested cleanup calls are more than the C stack can take. :-)
use strict;
use warnings;

my $n = 2_000_000;
my $s = "hello";
my $t = deepref(\$s, $n);
print "Made ref: ";
my $p = <STDIN>;    # pause here so you can look at memory use before exit

# Build a chain of $n nested references ending at the scalar $s refers to.
sub deepref {
    my ($s, $n) = @_;
    my $r = \$s;
    for (1 .. $n) {
        my $t = $$r;    # step one level down...
        $r = \\$t;      # ...then wrap in two more: net one level deeper
    }
    return $r;
}
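If you ever did hit this in real code, one escape hatch would be to unwind the chain yourself before it goes out of scope. Here's a rough sketch of what I mean (the loop below is my own illustration, not part of the test above): cut each downward link before dropping the node that held it, so nothing is ever more than one level deep when it gets freed. Alternatively, POSIX::_exit bails out without running global destruction at all.

# Illustrative teardown, not part of the original test: free the
# chain iteratively so exit has nothing deep left to recurse through.
while (ref $t and ref $$t) {
    my $next = $$t;    # hang on to the rest of the chain
    $$t = undef;       # sever this node's downward link
    $t = $next;        # drop the now-isolated node; step down a level
}

After that loop, $t holds at most one remaining level of reference, and the exit is instant.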