Putting the ``Backwards'' in ``Backwards Compatibility''?

chip on 2000-05-30T19:22:58

Every time we upgrade Perl, it seems that something breaks. Sometimes that something is very small, but occasionally it's large like "@" in strings. Sometimes it's on purpose; usually it isn't. But what is the ideal? What can we -- and users -- reasonably expect from Perl?

This is a subject I'm obviously interested in because I'm working on Topaz, which will become Perl 6 if it works out. And reimplementing a whole language makes breakage almost inevitable. So since some things are going to break, it could be argued that we shouldn't worry about it. And maybe that's true for an upgrade from Perl 5 to Perl 6... In particular, Larry has said that for Perl 6, everything that's officially deprecated in Perl 5 is fair game for deletion.

But there's another perspective that's worth considering. I recently had a mail exchange with a programmer who loves Perl but decided not to use it for his product, because he couldn't rely on every Perl program he writes today continuing to work for the indefinite future on all upcoming versions of Perl.

I think there are three major questions raised by this story.

  • Is it appropriate to expect all Perl programs to work forever with all future versions of Perl?
  • If not, what does that mean for Perl advocacy? Should we really be encouraging people to use Perl for systems that are deployed far from maintenance programmers?
  • Would it be worthwhile to resynchronize the documentation and the regression tests so that every documented behavior is tested?

I invite perspectives on these issues from everyone....


C?

pudge on 2000-05-30T19:58:08

I think it is worth noting that not all C programs work forever, either. Compilers and libraries -- even *gasp* standard ones! -- change. Very well-written C programs should continue to run on modern compilers, but sloppy ones won't. C programs that exploited hidden or undocumented features can break. Hey, this is beginning to sound familiar ...

Perl might be in worse shape than C in this regard, but it is something to ponder.

But C can be compiled...

chip on 2000-05-30T20:06:05

... and the compiled form is likely to work for a very, very long time.

Maybe what we're seeing is that byte-compiled Perl is more important for reliability than anything else.

Clean out the cruft

kirbyk on 2000-05-30T20:10:00

It's fairly important, IMO, to be able to clean out old code every now and then. This means rewriting, and that means things will break.


However, so what? If someone has some code that won't work in Perl 6, they can keep running perl 5.x indefinitely. It's not like we're taking anything away from anyone. Odds are fairly high that if something breaks, they'd be well advised to rewrite it anyway.

I'd rather have a language that was well maintained and up to date, with efficient code, than one that was backwards compatible to every idea anyone ever stuck into it.

Re:Clean out the cruft

chip on 2000-05-30T20:15:26

But can we really tell them to keep Perl 5 and Perl 6 forever?

What if a security problem arises with Perl 5 ten years from now? Will we care? Do we care if someone finds a (new) security problem today with Perl 4?

I think it's inevitable that, at some point, bit rot will set in and Perl 5 itself will be deprecated. The key questions are (1) how long that can be delayed, (2) how long we think it should be delayed, and (3) how that should affect what we do and recommend for users.

Re:But C can be compiled...

pudge on 2000-05-30T20:28:35

Sure, but how many unchanged, compiled programs from 10 years ago still work? And how many of those do you still use?

Re:But C can be compiled...

chip on 2000-05-30T20:36:09

Aren't there people still using SunOS 4 binaries under Solaris? Isn't it possible for those binaries to be ten years old? How about commercial apps for Xenix that still run under emulation today?

A raw count of old binaries doesn't really address the point, IMO, because the whole universe of computing has expanded so quickly that even if all old binaries were still used, they'd be outnumbered by the new ones.

More to the point: For those who have chosen as their goal the creation of code that works for ten years, can we recommend Perl as an appropriate tool? If so, what constraints does that place on them and us?

Re:But C can be compiled...

pudge on 2000-05-30T20:55:56

I just use the "count" as an illustration: for the overwhelming majority of code out there, such longevity is not an issue. Even when the compiled code is still in use, sometimes the OS has been upgraded and the binaries break. Unless you are going to make sure the OS stays the same, you probably can't count on a program of significance still working in 10 years. And who is going to be using the same OS in 10 years?

Yes, some people will. And I'd have to say that unless you can compile a Perl program, then no, we can't -- unless you are willing to not upgrade your Perl, which is perfectly fine. What if a security hole is discovered? Well, what if there is a security hole in your OS somewhere? What if you have a compiled C binary and in 10 years you find a security hole, but you lost the source? :-)

Ten-year-old MPE programs are common

ashted on 2000-05-30T23:04:03

Working under MPE, we have a lot of programs compiled 10 years ago which are still happily working every day. I'm pretty sure we even have one or two which were compiled in the 70s and which still run today.

For a shop like ours with lots of inhouse code, the idea that upgrading the language may suddenly cause widespread breakage is a pretty scary thing.

At the very least, getting the test suite as complete as possible so that we can tell people what is going to break is important.

Re:Ten-year-old MPE programs are common

chip on 2000-05-30T23:38:45

Wow, MPE. That brings back memories....

If modern MPE is anything like the MPE IV (?) I used in the early 80s, it's extraordinarily stable--just the kind of system you'd expect to be upgraded about once per decade.

I heard a story about an MPE system that got turned off in the middle of a really long COBOL compilation. When the system was repowered, the OS started doing lots of active disk stuff but didn't generate any output. Thinking that it was wedged, the operator write-protected the hard drive in preparation for shutdown. (Back then, disks were dishwasher-sized and had their own power and write-protect buttons.) The MPE kernel beeped--and MPE never beeps. But it didn't just beep once. It beeped three times. Then it printed the hard disk equivalent of: ``Put The Candle Back!'' The startled operator complied. Soon thereafter, the system resumed operation as if the power had never been lost.

I don't know if the story is true, but it's certainly believable to this MPE user.

PS: Does the HP-3000 architecture still live?
PPS: Does MPE still ship with a ``ttymon'' program? }:-)

Can the likelihood of breakage be classified?

jzawodn on 2000-05-31T00:32:21

Can anyone (from p5p, maybe) look at Perl today, and tell us what the likely breaking points are? Functions will change their return values? Syntax changes? Garbage collection?

Maybe a line should be drawn clearly in the documentation between what you can rely on and what behaviour is considered "undefined" or a "mere side-effect of a particular operation which you really can't rely on".

Hmm. It is an interesting question.

Maybe "Perl" should be defined separately (somehow) from "perl". Much like there are several implementations of Java, there *could* be several compliant implementaions of Perl. In the process of doing that, you'd know what Topaz must not break.

Would that be practical? I wonder...

Re:Can the likelihood of breakage be classified?

chip on 2000-05-31T01:35:02

Larry doesn't want to separate Perl and perl. There would be drift and differences, and that would be more harmful than the variety would be beneficial. Or so he surmises. I think he's right, too.

We already have some classification of breakage.

  • Bugs. Oops. 'Nuff said.
  • New keywords can be introduced at any time, so calling non-imported subroutines without using "&" is asking for trouble (see the sketch after this list).
  • Indirect object syntax for method calls is particularly fragile in the face of new keywords.
  • Some features are officially deprecated and may go away, like $[ and $*.
  • Action at a distance may surprise existing code; for example, it's possible that a new warning or warning-related feature could confuse a $SIG{__WARN__} function.
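
A minimal sketch of the keyword issue (the builtin name "fetch" is purely hypothetical):

        # If a future perl grew a builtin named "fetch", the bareword call
        # below would silently start binding to it; the &-form always calls
        # the user-defined subroutine.
        sub fetch { return "user-defined fetch\n" }

        print fetch();     # fragile: a new keyword could someday claim this name
        print &fetch();    # the explicit & pins the call to our sub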

More?

Re:Can the likelihood of breakage be classified?

pudge on 2000-05-31T02:24:13

>New keywords can be introduced at any time, so calling non-imported
>subroutines without using "&" is asking for trouble.

I thought it was asking for prototype checking and cleaner-looking code. Sigh.

Break, Broke, Broken

AutoPerl on 2000-05-31T12:30:05

The chief problem is time -- I need to use Perl, not nursemaid it. I use Perl partially because it is a RAD language that lets me write a lot of features in a short time, and because it is so portable. I lost an entire weekend when I upgraded to Perl 5.6 and it broke my DBI and DBD modules! I had to dig out the old Perl, reinstall it, install my database modules again. In short, I don't have time to do full regression testing on every line of code I've ever written in Perl. I need to know Perl won't break, or I'll just never upgrade. Right now, I'm going to be on 5.005 for a long, long, long time because I don't have time to dig around and upgrade all my modules and stuff.

And, if Perl 6 won't run existing Perl 5 code as-is, it's utterly useless and will die a horrible death like COBOL 9x and FORTRAN 9x, which no one uses because they're not backwards compatible. Even the new C 9x standard is going to die horribly because there are zillions of lines of ANSI C code in use, and zero lines of C 9x code. All these languages are great improvements, but useless because the code won't work with old compilers.

Perl is the ISP code of choice, and either 1) ISPs will ignore Perl 6 because it breaks everyone's code and stay with Perl 5, or 2) they'll upgrade and break everyone's code -- you can guess which is more likely to happen. People are going to standardize on Perl 5 just like they did on COBOL II, and like they will on ANSI C 89. These new languages will just die from not being used.

Perl 6's mission has to be to run every valid Perl 5 program FIRST -- this has to be the goal for Perl to survive. Learn the history lesson from these other languages!

RE: Backwards Compatibility

drbob on 2000-05-31T12:34:28

Backwards compatibility is GOOD. I can still run (re-compile) most of my FORTRAN-IV programs, and yes, I consider F90 an abomination.

However, some of my F-IV programs didn't work, but I was able to fix them because the changes in the language were well documented. That's OK, provided simple, routine structures aren't changed (for example, I would assume that $_ in Perl 6, Topaz, etc. would still be modified by the s/.../.../ command).
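
For instance, a tiny illustration of that assumption (this is how Perl 5 behaves today):

        $_ = "substitute me";
        s/substitute/replace/;    # s/// operates on $_ by default
        print "$_\n";             # prints "replace me"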

Bob

Some thoughts ...

alleria on 2000-05-31T12:53:05

Like others have pointed out, it's not necessarily appropriate for all current Perl code to work in all future versions, although it's also obvious that breaking 'basic' features would cause lotsa angst.

Besides, aren't deprecated features the icky stuff like allowing the user to set what the first element of an array is, etc. that really should go away anyhow?

It's true that Perl is now being used in large-scale projects in many cases, sometimes replacing huge chunks of C/C++/Java, but one would suspect that in such large-scale projects, maintenance programmers should be nearby.

Conversely, for situations where there are no maintenance programmers, one would expect a dearth of sysadmins as well, and thus no upgrade from Perl 5 to Topaz (and thus, no compatibility problems).

Some have mentioned the situation of 'what happens if Perl 5 has a big security bug that gets fixed in Topaz and people want to upgrade?' ... And I think that probably the best solution would be to just have Perl 5 continue to be maintained until it is Perl 4 caliber in terms of stability/security, and then just freeze it at that.

(I seem to remember Larry, I think, saying how some shops are still using Perl 4 because Perl 5 isn't stable, and causes some breakage? I don't really see how the difference between 5 and Topaz would be that great, unless you're really planning to break a truly huge amount of current features.)

One last thought comes to mind: ensuring compatibility with modules on CPAN. A lot of Perl users will want to upgrade to Topaz, but keep on using their dear CPAN modules. Suffice it to say that if CGI.pm broke, then "hell hath no fury ... " ;)

But even in lesser cases, a user might be inhibited from upgrading to Topaz because his modules don't work with it. Again, it's probably the module author's responsibility to update, but that doesn't always happen as much as we'd like, especially in the case of unsupported modules of various sorts.

(And I do shudder to think what any module that currently links to a C library will have to have done to it to work under Topaz ...)

But I mean, overall, Topaz is just the greatest thing! Hopefully you're thinking of things like line disciplines, compiler hints, and possibly spinning formats off into a separate module that may even allow more flexible usage?

I know that some of the above, for example, might break compatibility (screams of outrage at those that actually use formats, hehe), but IMHO the benefits outweigh the potential problems.

My $0.02,

-- Alleria

Re:Ten-year-old MPE programs are common

ashted on 2000-05-31T12:56:42

Modern MPE is still extraordinarily stable :-). The HP3000 has become the HPe3000 and is generally built around a PA-RISC processor (the same hardware, pretty much, that HP-UX runs on) and will be available on IA-64.

The story you tell sounds typical. I was recently talking to a linux fan about the stability of the box and went back and pulled the power loose from our 947. The box, of course, went immediately silent. Then I plugged the power back in. Once the drive spun up, and was active for a short bit, the console commented that there had been a power failure and life went on with no other effects. Logged-in sessions were still logged in and sitting exactly where they were before, etc :-).

These days, though, MPE has not only the extraordinary stability and numerous features which MPE users have long loved, it also has Perl, Java, Apache, Sendmail, etc. and is continuing to provide a rock-solid foundation for business in the "Information Age".

P.S. Nope, so far as I know, the "ttymon" is gone.

Re:Can the likelihood of breakage be classified?

ashted on 2000-05-31T13:10:58

I'll second that.

CPAN Compatibility

pudge on 2000-05-31T13:13:48

The problem is that so much code on CPAN is poorly written. Not most of it, but a significant enough amount of it to mention. Much of the time, when perl "breaks" something on CPAN, it is the CPAN author's fault.

This is not always the case, especially with a major update like perl 5.6. However, I think making sure things on CPAN don't break is not a worthy goal for perl porters; it is a worthy goal for CPAN authors and CPAN testers.

Oh, and the CGI module ships with perl; I think it is clear that it will always be a goal that modules that ship with perl will work with that version of perl.

Re:Can the likelihood of breakage be classified?

chip on 2000-05-31T14:03:16

You know, now that you say it that way ....

I wonder if it might be possible to reverse the current rules someday. It's probably more robust over the long term to default to calling a subroutine with the same name as an operator.

Re:Some thoughts ...

chip on 2000-05-31T14:07:56

Yes, the deprecated features are the ``yucky bits''.

Your point about CPAN is well-taken. I think cpan-testers will help a lot.

As for interfacing with C, I'm expecting to create a compatibility layer so many existing XSs can continue to work, albeit with some efficiency hit. Fortunately, writing the equivalent of an XS for Topaz will be a lot simpler than XSs today.

Silent Changes are the Worst

chip on 2000-05-31T14:11:04

I hope not only to document language changes, but to detect any potential breakage at compile time.

I know I don't mind language changes much, as long as there are no silent killers.

Re:But C can be compiled...

doug on 2000-05-31T14:12:27

I think a lot of this depends upon the platform.

At work I use Solaris, and we expect things to continue as they've been for quite a long while. At home I use Linux, and I don't get bothered when things stop working. It is a cultural issue. Some people prefer that everything remains stable, even at the expense of new features/functions. I usually prefer to update code and spend the time keeping my software up-to-date if I feel that I'm getting something for it (p6 must be better than p5). If I may (over)generalize: I think that my attitude is more common in the Linux community.

I imagine that the same split exists in the Perl community. There will be some people who want new features and will pay whatever price, while there are those who want yesteryear's code to run without any problems. There is no way of making both groups happy, so you will have to pick your battles carefully.

When you're talking about breaking things, what do you mean? I've wanted a

        use deprecated;

type command to allow $[ and all the other out of date concepts, but then again I'd like

        use anonymous;

to enable $_ because I prefer to name things.

But I think you mean more than that. Any concrete examples?

Getting back on topic, I don't think it is that big of a deal if p6 is not 100% backwards compatible with p5 as long as the p5 source is available. If someone wants to use the old stuff, go for it. It would be better if p5 were maintained a bit after p6. I don't mean new releases, I mean more of fixes/updates to keep the existing stuff working.

People's failings shouldn't hold back development

jns on 2000-05-31T14:32:40

Is it appropriate to expect all Perl programs to work forever with all future versions of Perl?

Well, yes and no. I mean that there was a lot of chat in comp.lang.perl.misc after the release of 5.6.0 that would indicate that people never tested the new version of Perl with their existing programs before going live with the new release - this is quite simply bad practice. People blundering into the installation of a new version of anything without a proper implementation plan deserve everything they get, and I don't think that Perl is any different - in the end you can't hold back the development of a piece of software simply in order to allow those people to carry on in their own sweet oblivious fashion. Of course perhaps a release might be made with greater caveats as regards the testing of pre-existing programs ...

Re:Can the likelihood of breakage be classified?

pudge on 2000-05-31T14:57:54

I don't know if that is the answer, either. Talk about confusion. But maybe it is the best answer.

Backwards compatibility's a must

Elian on 2000-05-31T15:58:49

Is it appropriate to expect all Perl programs to work forever with all future versions of Perl?
All the ones that use documented features, yep. It's one of the things that marks a solid, well-engineered piece of software.
If not, what does that mean for Perl advocacy? Should we really be encouraging people to use Perl for systems that are deployed far from maintenance programers?
The existence of maintenance programmers is reasonably irrelevant here. Perl's a language, and that's a low-level-enough thing that it ought to change rarely and with great thought. Features can be added, but should not be removed if at all possible. (They probably ought to remain in the language for ages after they've been removed from the docs.) The only thing that ought to change even less frequently is the user-level interface to an OS.

C's been brought up as an example of languages changing, but you'll notice that it's only gone through two major changes (if you want to count ANSI as major) in thirty or so years.

Would it be worthwhile to resynchronize the documentation and the regression tests so that every documented behavior is tested?
Yes! Absolutely and without a doubt. (Heck, if we had that we'd have caught bugs in the 5.6.0 release...) We can't tell if perl is doing what it ought to be doing if we don't have test cases to check everything. This is another hallmark of well-engineered software.

Standards are the key

jjohn on 2000-05-31T19:07:58

If Perl had a standard spec which defined its behavior, then Topaz's implementation shouldn't break user scripts. Usually, programs that break do so because they were exploiting a marginal feature. Perl is reasonably mature these days. It might be time to set some of it in stone.

There is an upcoming interview with Larry Rosler on www.perl.com in which he advocates strongly for the standardization of Perl. Governments and large businesses like standardized languages.

Anyway, this is just a thought.

Re:Clean out the cruft

jerryl on 2000-05-31T19:26:18

I agree that if a new version of Perl breaks an existing program then simply keep the old version of Perl installed until you have time to rewrite.

You can always have multiple versions of Perl installed at the same time and then gradually switch over. At least this is what I am doing for the move to 5.6.0.

JL

backwards-incompatible + write-only = trouble

sethg on 2000-05-31T21:53:38

I'm not concerned about Perl 6 breaking the scripts and modules that I write. I am concerned about it breaking stuff that other people write and that I depend on.

In order to get stuff done in Perl, I use code from other people -- sometimes modules from CPAN, and sometimes home-grown code from people who have moved on to other projects. If there weren't so much code like this available for me, or if making the code work on my system required more than a few "make" commands, I would probably not be doing much with the language at all. I suspect a lot of other novice/journeyman Perl hackers are in the same position.

If this code-from-elsewhere works, I don't care what it looks like. If it doesn't work, making it work right is a time-consuming and painful task. Fixing bona fide bugs in this code, and discovering undocumented quirks in its interfaces, and patching what breaks when I try to bolt on new functionality, keeps me quite busy enough, thank you very much. I don't want to compound my troubles by having the language semantics shift under my feet, especially since the part of the code most likely to be affected by such a shift would probably be the part that's hardest for me to decipher. You can blame the original authors for bad writing or for using deprecated features, but that blame doesn't help me any.

Look at C++. If the designers of C++ had been willing to break backwards-compatibility with C, it could have been a much more elegant language -- but it would have also been a much less widely used language. Or look at Linux. If Linus hadn't modelled his kernel's API on Unix, it would be harder for both applications and human developers/sysadmins to move between Unix and Linux. Or look at Perl; it came (still comes?) with scripts that convert awk/sed scripts into Perl scripts.

Organizations that depend on a large base of installed Perl code are great advertisements for Perl. If these organizations can upgrade to Topaz without putting a significant workload on their in-house Perl hackers, then they'll be great advertisements for Topaz. And remember that when these hackers have a lot of other work on their agendas, it won't take much of a workload to make them say, "oh, the heck with it, we've got better things to do than upgrade".

The essay "Chicken and Egg Problems" discusses how backwards compatibility can make or break a software platform; it's not 100% applicable to free software, but I commend it to your attention anyway.

Re:Can the likelihood of breakage be classified?

Abigail on 2000-06-01T00:12:04

It's probably more robust over the long term to default to calling a subroutine with the same name as an operator.

Perhaps. On the other hand, AFAIK, there are no keywords in Perl that start with a capital letter followed by a lowercase letter. If it isn't already formalized that Perl will never have mixed-case keywords, it won't break anything to do so. Which means that people who want to protect themselves against suddenly having a keyword with the same name as one of their subroutines can use mixed-case subroutine names.
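
A small sketch of that convention (the subroutine name is made up):

        # Perl's builtins are all lowercase, so a mixed-case name can never
        # collide with a future keyword.
        sub Fetch_Config { return { verbose => 1 } }

        my $cfg = Fetch_Config();   # bareword call, safe without &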

-- Abigail

Re:Break, Broke, Broken

Abigail on 2000-06-01T00:15:07

People are going to standardize on Perl 5 just like they did COBOL II, and like they will ANSI C 89.

People have never standardized on an old version of perl, regardless of how much code was broken. Perl 5.004 was even maintained for quite some time after perl 5.005 came out, but there wasn't any sign people were going to "standardize" on perl 5.004.

-- Abigail

Case by case basis

Abigail on 2000-06-01T00:40:13

My position is that it depends. When I was still participating in p5p, in almost any discussion the compatibility-with-older-versions argument was being raised.

However, I found that it was used very often as a political argument. When a change in Perl was proposed, someone opposing the feature would come up with a construct that was claimed to be used in thousands and thousands of programs, and those programs would break. Of course, you would never see the quoted construct in Perl usenet groups, web sites, journals, CPAN modules or on p5p, except as an argument against a proposed change. But if there was another proposal said person would like, said person wouldn't have a twisted construct up their sleeve.

In the discussion about the ?? operator, it was pointed out to Larry when he proposed the name ?? that it was theoretically possible a program would break. He said that for such seldom-used constructs, he didn't care if a few programs would break.

And that's basically my viewpoint. It's a trade-off. Compatibility is important, but it's not the ultimate goal. If a change in Perl breaks a very seldom-seen construct, and there is a lot to be gained, then that shouldn't stop the change. The few programs that will break should either be rewritten, or, on a few hard disks, people will keep an older version of Perl around. If that keeps a handful of people from programming Perl, so be it; a lot more people will be happy.

But there is no point in breaking compatibility if there's nothing to be gained. $[ is an often-quoted example of something that could disappear. Granted, it doesn't seem to be used often, and most people think that assigning to $[ is bad coding anyway. But what is to be gained from removing $[? If there's nothing to be gained, then there's no reason to break compatibility. On the other hand, introducing ?? might break a few programs, but there's lots to be gained. It's one of the most requested features I've seen in the past years.
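
For reference, the action at a distance that $[ permits, which is why it's usually called bad coding:

        $[ = 1;                       # deprecated: indexing in this file now starts at 1
        my @words = qw(alpha beta gamma);
        print $words[1], "\n";        # prints "alpha", not "beta"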

Compatibility changes should be considered on a case-by-case basis, and the pluses and minuses should be compared. Unfortunately, I think those are hard to quantify objectively; only discussion can lead to an answer.

-- Abigail

Keyword cases

chip on 2000-06-01T01:51:41

I think it's already formalized that Perl keywords never include upper-case letters.

But I hope we can recommend a prettier style than "Myfunc($x)", because I know the mere idea of having to capitalize every function name gives me the screaming heebie jeebies.

Re:Standards are the key

chip on 2000-06-01T01:54:07

Well, Larry has basically considered defining Perl separately from the perl manpages a Bad Idea. His comment was, IIRC: ``I will be certified before Perl is.''

Besides, if you think the C++ standardization process was a mess, just imagine an ISO Perl committee trying to decide which regex features to bless...!

Re:backwards-incompatible + write-only = trouble

chip on 2000-06-01T01:58:07

Your point is well-taken; and I had originally hoped for 100% compatibility. Who knows, I may actually achieve it.

I think it's clear that language evolution will have to be marked in the source code. We already have use strict, so someday we may have use large for programming-in-the-large (i.e. no language features that facilitate quick hacks at the expense of maintenance).

Re:Keyword cases

Abigail on 2000-06-01T04:10:44

I think it's already formalized that Perl keywords never include upper-case letters.

Perhaps not keywords, but there are words in Perl that have special meaning, and are written in full caps (which is why I suggested mixed case, and not full caps). CHECK might technically not be a "keyword", but if you had a sub called CHECK in your pre-5.6 Perl program, it didn't work quite as well when running it with Perl 5.6. Perl 5.005 introduced INIT. And if your program breaks because of a change like this, the technical detail of the change being a keyword or a special case identifier is of little help.
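
A minimal illustration of the surprise:

        # Under 5.005 this defines an ordinary subroutine that never runs
        # unless called.  Under 5.6, a sub named CHECK is also treated as a
        # special block and executed automatically at the end of compilation,
        # so the same source quietly changes behavior.
        sub CHECK { print "this now runs by itself under 5.6\n" }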

-- Abigail

Re:Break, Broke, Broken

AutoPerl on 2000-06-01T14:03:06

>>> People have never standardized to
>>> an old version of perl

Not yet -- but there hasn't been a version of Perl that breaks all that much, either. Version 5.6 is the first one I've upgraded to that completely destroyed my Perl development environment (and I admit I did not compile it myself but installed a binary).

I'm just trying to point out the history of other languages, like C, FORTRAN, and COBOL, and how some later versions of some of these got to be largely ignored and used only in specialized cases because they broke or changed the language enough to cause problems with existing code and implementations. It's worth looking at the fate of COBOL 9x and FORTRAN 9x (and I'm sure C 9x will be similar with all the ANSI C code and compilers out there) before deciding how radical to get with Perl.

Re:Keyword cases

chip on 2000-06-01T19:16:06

Yup, you're right. INIT and CHECK are counterexamples.

On the other hand, if the parser could distinguish between "INIT" and "sub INIT", we could add new INIT-like features without breaking compatibility.

Glitches in 5.6 upgrade?

chip on 2000-06-01T19:18:39

Perl 5.6 ``completely destroyed'' your environment?

Would you please spell out the problems you had? The examples could be instructive for the future.

Besides, I'm curious ... I haven't deployed 5.6 myself yet, and I'd like to know what to expect.

On the other hand, language adoption is much slower than language development, so just because there's an N month delay in deployment of 5.6 doesn't mean that 5.6 is doomed.

Re:Break, Broke, Broken

glauber on 2000-06-01T22:20:36

It could be because I compiled it, but upgrading to 5.6 didn't break anything here (though I had some fun when installing a new version of DBI, later, because Makefile.PL didn't find the old DBI directory).

All in all, it was painless.

Of course, I ran 5.6 on a "development" machine for about 1 month before installing it on production.

glauber

Don't break anything that's documented...

glauber on 2000-06-01T22:44:44

If Topaz is to be successful, it shouldn't break any documented behavior. If it breaks any hacks and exploits, or if it breaks deprecated stuff, so be it...

If you are anticipating serious compatibility problems, perhaps there could be a tool to scan a Perl 5 program and flag potential problems.

One of the things you are up against is that Perl 5 is very good. If Topaz breaks things, people won't have a very compelling reason to use it.

glauber.

Re:Break, Broke, Broken

Dullman on 2000-06-02T00:04:46

There is something vaguely un-perlesque about this entire subject. There is laziness enough, and impatience, certainly. I think what it lacks is hubris. Unfortunately, AutoPerl is quite right. Out in the trenches, this is no small issue. I, for one, shepherd a small number of perl scripts on many hosts in a large distributed environment. They will be, I'm afraid, around for a comparatively long time. (Hopefully, longer than their creator). It's a major manufacturing environment and breakage is simply not an option. I would be greatly annoyed at having to roll major changes just to upgrade perl; (such annoyance is, I believe, de rigueur for perl programmers). However, a minor change, such as "use deprecated;", would not make hard what should be easy. Surely, the language should be able to evolve without turning to a pillar of salt just for looking back; this should not have to be one of those "hard fatherly choices" which, by their very nature, upbraid the LIH philosophy. I would feel cruelly disillusioned if the honored keepers of the flame did not rise to this challenge and give the yearning masses a technical solution that will make the hairs on the backs of their necks stand up. There is glory and honor to be had here. This is the hubris we have come to cherish -- and expect.

Re:Break, Broke, Broken

chip on 2000-06-02T00:45:53

For some reason, while reading Dullman's comment, I heard The Star Spangled Banner playing in the background.

Played on kazoos.

Maybe I should have that looked at.

Re:Don't break anything that's documented...

chip on 2000-06-02T00:46:57

Yes, Sarathy et al set a high standard. I wouldn't have it any other way.

Re:Glitches in 5.6 upgrade?

Abigail on 2000-06-02T01:17:25

Besides, I'm curious ... I haven't deployed 5.6 myself yet, and I'd like to know what to expect.

The only problem I had was that one of my JAPHs no longer works in 5.6. Unfortunately, it's one of the JAPHs I intended to include in my YAPC talk.

I perlbugged the issue a while ago, but there wasn't any response (not even "that's not a bug, that's actually a bugfix!") so far.

For the curious, the issue is that if you have *foo = *bar;, and then call foo() while neither sub foo nor sub bar is defined, hence triggering AUTOLOAD, the value of $AUTOLOAD differs between 5.005 and 5.6. But I highly doubt that that is a showstopper for rolling out 5.6 in production.
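
Spelled out, the construct looks roughly like this:

        # foo is aliased to bar; neither sub is defined, so foo() falls
        # through to AUTOLOAD.  Whether $AUTOLOAD then reports main::foo
        # or main::bar is what changed between 5.005 and 5.6.
        sub AUTOLOAD { print "AUTOLOAD saw: $AUTOLOAD\n" }

        *foo = *bar;
        foo();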

-- Abigail

Re:Glitches in 5.6 upgrade?

chip on 2000-06-02T06:24:44

WRT interactions of glob assignment and AUTOLOAD:

To be honest, I don't know whether that's a bug or a bugfix.

Re:Clean out the cruft

rodgerd on 2000-06-02T21:09:20

You can keep running perl 4 indefinitely, too, and some people do. But you'll note that there are well-known bugs, including security issues, with perl 4, and no one is going to fix them. I don't think this is a particularly valid argument; it works as a compatibility strategy while people make modifications to code to help get perl 6 going, but that's about it.

Re:backwards-incompatible + write-only = trouble

alleria on 2000-06-02T22:36:58

While this doesn't exactly solve the problem of the already-written and only marginally supported existing codebase, it would be really nice if the core Perl 5 Porters and Perl 6 Porters could get together and outline the amount and types of breakage. At least then warnings could be inserted into existing 5.6.x or later versions about things that might cause breakage when Topaz comes.

Might smooth out the upgrade path, I think.