I wouldn't have thought that it was possible to be more wrong than Neil Munro's prediction of the result of the Iraq war:
The painful images of starving Iraqi children will be replaced by alluring Baghdad city lights, smiling wage-earners and Palestinian job seekers.
But in a piece scare-mongering about the Y2K problem, Munro predicted that Al Gore would be one of its big losers (National Journal, 20 June 1998):
Loser:
Vice President Al Gore, who fretted about global warming, legal authorities and campaign finance laws while missing the biggest technology problem facing the country.
A few more interesting things about Munro and about violence in Iraq.
--- Munro wrote a long piece in NJ in 2001 about Ritalin and the politics afflicting brain research. Some of the same tactics used against The Lancet (though milder) are evident there. In fact, people who deal with learning disabilities praise Ritalin as a wonder drug, something missing from Munro's account. Munro is listed as a science writer, but he feeds the right's deep anti-rational streak with his misplaced and ill-informed skepticism. That, in addition to Soros Derangement Syndrome (see http://www.medialens.org/), is what inflamed the pro-war extremists---a chance to spit on the scientific "establishment."
--- Munro has dug himself further into a libel hole with comments on the Glenn Beck show (for those of you sensible people who do not know Beck, he is a right-wing extremist who has advocated a nuclear attack on Iran, among other such things):
BECK: OK. He [Lafta] doesn't -- he also says that they've never tried to influence the election, but don't you -- didn't you say that they [Burnham et al] -- as part of the condition of giving this study to this organization, that it had to be published before the election?
MUNRO: Yes. They made it clear they wanted this study published before the election. However, they have not said they wanted to change the election results. But one of the authors on the results was running for the Democratic Party seat in that year.
.....
As I've made clear in many previous posts, there was no election-year timing in L2. Most certainly, no one made it a condition of any transaction. Beck repeated this on his radio show the same day (Jan 14):
BECK: "The two coauthors, Gilbert Burnham and Les Roberts, John Hopkins University, they told reporters that they opposed the war from the outset and they sent the report to The Lancet, to the editor and said, you've got to report this right now. There was no time for a peer review. The key person in collecting all of this data was an Iraqi researcher. He had failed to follow the customary scientific practice of making his data available for inspection . . . The Lancet study was funded by George Soros. They sent the report to The Lancet on the condition that it be published before the election. The study couldn't be more unreliable..."
http://www.glennbeck.com/content/articles/article/198/4100/
Untrue, and libelous again.
--- Much of what I have argued about L2's essential plausibility (and now ORB's, and the MoH total for excess mortality) cites corroborating evidence. This I tried to present to Munro, and he hostilely refused to listen. For example, the 3.5 million or so displaced by the war. There is no war with a deaths-to-displaced ratio of less than 1:10; most wars have a much narrower ratio, 1:3 or so.
Or consider the number of widows. "Where are all the wounded, the bodies, and the widows?" was a common question after L2 appeared. There are answers, of course (the bodies are in graves), but this week we have more about the widows: 1 to 2 million, according to government sources. Not all of them from this war, of course, but likely 0.5 million at least. Recall that not all the dead from the war are married men, and you have a very big figure. It's imprecise, but indicative, as are the numbers of displaced, the opinion surveys of Iraqis, and other suggestive evidence. None of this appears in Munro, the Sunday Times, the Wall Street Journal, or the many other critics.
On widows: http://www.reuters.com/article/featuredCrisis/idUSKAM744152
The funniest part about the Y2K claim was that by 1998 it was obvious to all but the most addled that Y2K would be little more than a blip, if that. By then the Y2Kers were forced to resort to claiming that the problem would be acute because third-world nations would be affected, not realizing that nations where the electricity goes off several times a day at random were the best equipped to deal with computer outages and blips.
The funniest part about the Y2K claim was that by 1998 it was obvious to all but the most addled that Y2K would be little more than a blip, if that.
I worked for years at a bank as a systems analyst. The banking industry took Y2K seriously. I spent many, many hours making sure my code was clean, and my portion of the portfolio (~US$2B) was small beer. Sure, all of us were pretty confident the further into the project we got, but we spent the time anyway.
Best,
D
Where I worked at the time, one section of my organisation had a large dataset of crucial information which had to be available to everyone else, live, for online data access (to give to clients). They had a bodgy old bespoke database they didn't want the Y2K boys touching, so they decided to go it alone and did nothing. On New Year's Day and for about 3 days afterwards their database was completely useless.
The Y2K boys I worked with were from an offsite contractor, who were also working with a major bank. That bank took Y2K so seriously that they had a large command centre in Sydney to report errors and problems back to the European head office, so they had a few hours before dawn to correct them.
I would say that a lot of people took this very seriously, and that is the reason Y2K was more of a fizzle than a bang. (In fact I predict that 40 years from now, if our response to global warming is successful, we will be hearing this again from the shills of the day - "global warming was a beat-up, look nothing happened - so why worry about this asteroid issue?")
Y2K was an example of a problem in which:
a) The problem was clear, and was going to cause a mess if not dealt with.
b) To cost-effectively deal with the potential mess, a whole lot of people had to start early, and integrate fixes over a long period of time, so that it was ready on the date.
c) And then, surprise! the resulting problems were small.
A similar problem is that of preparing for Peak Oil without causing the equivalent of the 1970s downturn, but on a much bigger scale; see the Hirsch Report:
http://en.wikipedia.org/wiki/Hirsch_report
The report says that if we start 20 years before Peak Oil, we could get by with minimal economic problems... unfortunately we didn't.
There was a good article today in the WSJ, "Big Oil's Not-So-Big Growth Plans", which notes that even with high oil prices, major oil companies are buying back stock, paying dividends, and not investing as much in hunting for oil. That's because it's getting harder. [I used to help design & sell supercomputers to petroleum geologists, among others, around the world: Calgary, Denver, San Ramon, Houston, China, Australia, Malaysia, Abu Dhabi, Dubai, Bahrain, Dhahran, etc.]
If one were selling fossil fuels, knowing that the Peak was here or coming soon, one would want people to stay as dependent as possible, i.e., keep investing in infrastructure and fleets that need it, so that as it gets more scarce, the price goes higher, and it commands a bigger fraction of people's spending. Hence, it makes perfect sense for one to encourage others to avoid conservation and reworking of infrastructure to be less dependent in advance. I.e., we'll end up burning all the oil and gas that we can get, but the big variables are:
- how much gets left in the ground, which depends on the price
- as a result, how much profit accrues to fossil companies
Of course, coal companies (which are not near Peak) prefer that oil&gas get used up as fast as possible, and before people can build renewable infrastructure, since that will create terrific pressure for unsequestered coal and synfuels...
The Y2K problem was grossly overblown, as can be seen from the following observations:
(i) Many forward-looking calculations involving 2000 and later years were made in 1998 and 1999, when very few organizations were compliant.
(ii) Fiscal year 2000 began in April or July 1999 in many jurisdictions, with hardly any problems.
(iii) Compliance efforts in non-English-speaking countries were very limited, and there were no ill effects.
(iv) Lots of small businesses were non-compliant, and nothing happened that could not be dealt with by fix-on-failure.
I worked for years at a bank as a systems analyst. The banking industry took Y2K seriously.
Exactly why it was obvious that it would be a blip. People were taking it seriously, and fixed what was simply a problem, like many other problems which need fixing and therefore get fixed. The people I think of as Y2Kers weren't the sensible ones, but the nuts who right up to the end were predicting the most fantastic gloom and doom.
Ah. Got it. Agreed (typed while sipping water from the tap, not from a bottle stored in the bunker).
Best,
D
I worked at Ford - by 1999, they were into serious Y2K work, and I was *not* on the IT side. Among other things, they deployed software to scan files on computers, looking for dates, to flag them for attention (my Excel workbooks were flagged like a large diplomatic conference).
John - I found a Y2K bug in Excel, back in 1997. In some PV formulae, using 2-digit date codes gave strange results if the range of dates spanned 2000. Going to four-digit dates, or resetting the whole set of dates forward into the 21st century, solved that. I imagine that it was common practice to use four-digit dates in a corporate application.
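For what it's worth, a minimal sketch in C (not Excel, and with made-up numbers) of the generic failure mode being described: once a range of dates spans 2000, arithmetic on bare 2-digit years gives nonsense unless the century is put back in first.

    /* Hypothetical illustration of a 2-digit-year span crossing 2000. */
    #include <stdio.h>

    int main(void)
    {
        int start_yy = 98;  /* meant as 1998 */
        int end_yy   = 2;   /* meant as 2002 */

        int naive_years   = end_yy - start_yy;                    /* -96: nonsense */
        int correct_years = (2000 + end_yy) - (1900 + start_yy);  /*   4           */

        printf("naive: %d years, correct: %d years\n", naive_years, correct_years);
        return 0;
    }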
Y2K was real. It was fixed; few or no problems arose. So the media considers that it never existed in the first place.
Makes me a bit angry, but I coded well for a couple of years so none of my work would have been a problem. I wish the world could look back at it and consider it the success it was.
Y2K problem
As a programmer I dealt with accounting & billing.
As there was no way to add or subtract calendar dates for aging, we used Julian dates: the number of days elapsed since January 1, 4713 BCE, with no months, no years, and no Y2K (a short sketch of the arithmetic follows the link below). Every programmer I know did the same. The Y2K thing came from the early date chip in IBM PCs. It didn't matter in database and accounting work, as Julian dates were used, which is why the Chinese, who did nothing, had no problem. This was a political problem, not a technical problem.
see more at:
http://aa.usno.navy.mil/data/docs/JulianDate.php
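For readers who haven't met them, here is a minimal C sketch of the Julian Day arithmetic described above, using the standard Fliegel-Van Flandern conversion; the function name and the invoice example are illustrative only.

    /* Gregorian date -> Julian Day Number (days since 1 January 4713 BCE).
       Differencing two JDNs gives an age in days with no month, year or
       century handling at all. */
    #include <stdio.h>

    long jdn(int y, int m, int d)
    {
        long a = (m - 14) / 12;   /* -1 for Jan/Feb, 0 otherwise */
        return d - 32075L
             + 1461L * (y + 4800 + a) / 4
             + 367L  * (m - 2 - 12 * a) / 12
             - 3L    * ((y + 4900 + a) / 100) / 4;
    }

    int main(void)
    {
        /* e.g. an invoice dated 15 Dec 1999, aged on 31 Jan 2000 */
        printf("days outstanding: %ld\n", jdn(2000, 1, 31) - jdn(1999, 12, 15));
        return 0;
    }

Nothing in that subtraction cares which century it is, which is the point being made.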
richCares: you may be a relatively young programmer, and "every programmer you know" may not be a large enough sample to support these strong assertions.
When I started writing code in 1967, the Y2K problem had already existed for years, although nobody cared very much at that point. It definitely did *not* start with the IBM PC, but preceded it by 20-30 years, and was far more troublesome in mainframes, with legacy COBOL & PL/I software especially.
It's certainly better to keep Julian dates around internally, and of course, in many applications one needs finer resolution, i.e., UNIX-like "seconds since the epoch" or better, although of course the 32-bit version suffers from the "2038 problem." Even then, one still has to convert between the external date and the internal form, and there has often been screwed-up code there.
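A small, self-contained illustration of that 32-bit limit (the so-called 2038 problem), assuming nothing beyond the standard C library:

    /* The last second representable in a signed 32-bit seconds-since-1970
       counter; one tick later it wraps negative, back to December 1901. */
    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void)
    {
        time_t limit = (time_t)INT32_MAX;   /* 2,147,483,647 seconds after
                                               1970-01-01 00:00:00 UTC */
        printf("32-bit time runs out at: %s", asctime(gmtime(&limit)));
        /* prints: Tue Jan 19 03:14:07 2038 */
        return 0;
    }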
John Tirman: "Much of what I have argued about L2's essential plausibility (and now ORB's, and the MoH total for excess mortality) cites corroborating evidence. This I tried to present to Munro, and he hostilely refused to listen. For example, the 3.5 million or so displaced by the war. There is no war with a deaths-to-displaced ratio of less than 1:10; most wars have a much narrower ratio, 1:3 or so."
John, (or Tim), do you have a cite? I'd like to have that information.
John Mashey is correct - my DB & processing was still in COBOL, and integrated with C++ apps - calcs were done in J dates, but some processes kept track via 19xx. We tested that the differing processes used J dates and that the new COBOL processes could be handled by C++. All worked. Not a scam at all, but a lot of work.
Best,
D
Barry: [on Iraqi refugees](http://www.cbc.ca/world/story/2007/03/09/iraqi-refugees.html)
Barry,
After a very brief search, these five cases seem to demonstrate a ratio of displaced-to-dead of less than what now exists for Iraq, assuming a ratio there of 5:1, say, 3.5 million displaced to 700,000 dead.
Guatemala 110,000 dead; 250,000 displaced. Ratio: 2.5:1
Burundi 200,000 dead; 300,000 total displaced. Ratio: 1.5:1
Angola 1.5 million dead; 4 million displaced. Ratio: 2.7:1
Congo: 1.5-3 million estimated dead since 1994; 2.7 million displaced. Ratio: 2:1 at the high end
East Timor: 200,000 dead; 450,000 displaced. Ratio: 2.25:1
Sources: http://www.unhcr.org/cgi-bin/texis/vtx/publ/opendoc.pdf?id=4444afc42&tb… , https://www.cia.gov/cia/publications/factbook/fields/2194.html, various news and humanitarian relief organization reports, for displaced, and for mortality: http://www.prio.no/cscw/cross/battledeaths/data/Documentation%20I%20Upp…
Numbers are difficult to parse so there could easily be some errors in here, but I doubt any of these would climb above the 5:1 ratio implied for Iraq. I would add that many wars have displacement as an objective of one party or another. In Turkey, the state purposefully and forcibly evacuated 1 million Kurds. So, too, with Bosnia, El Salvador, and others. In these circumstances, the intention and capability to do so accounts for very high ratios. There has been sectarian "cleansing" of neighborhoods in Iraq, especially in Baghdad (hence, the peace of the graveyard), but not a state-run, large-scale ethnic cleansing as we've seen in many recent wars.
The number of displaced may be under-counted, too. UNHCR keeps the figures, and from this war the number they have is about 3-3.5 million. But many people have apparently left without being counted, though this is hard to quantify. The UNHCR figures make the point all the same.
One last nugget re Munro, sent to me by a reader---check out his article for NJ on evolution debates. He places intelligent design advocates at the center with evolution advocates on one side and creationists on the other.
I started programming in the early 60's (well before '67), initially in assembly, then C, then C++. Y2K was only a problem for non-programmers, or those that programmed in Basic, or quite simplistic programmers. For those not using Julian dates, the early dbase system stored the date as a string, "19991202", so again no Y2K problem. It was quite easy to display these in calendar-date format. I did not know a single programmer that used 2-digit years ("98") in their programs other than the PC clock chip. Which is why there was no effect from the Y2K problem. Ask the Chinese!
Thanks John, Tim.
richCares: sigh, regardless of your length of experience, you are continuing to assert that because (a) you knew no programmers who wrote code this way, (b) none existed. (a) may perfectly well be true, but (a) does not imply (b).
(b) is certainly untrue. Back when the Bell System existed, with 1M+ employees, with hundreds of major operations support systems and thousands of applications of various degrees of legacy-hood in the 1970s, there was plenty of code with this problem. I didn't write it, and knew only a small fraction of the programmers who did, but the problem certainly existed in large amounts of code, as was discovered as many standalone systems got linked together in the 1970s for tighter-coupled data interchange.
One more time: this problem did *NOT* start with PC chips. It was already there in COBOL & PL/I application code and IMS on IBM S/360s, and Exec8 on Univac 1100s, and older systems, long before microprocessors even existed. This mess really happened back when a 7.25MB IBM 2311 was a substantial disk drive, and people worried about every byte in a disk record.
I've visited China 6-8 times, starting in 1988, to talk with computing people, and *of course* they had fewer problems: in 1988, China had very few computers, and did not have masses of 1960s/1970s legacy commercial software.
Show a few coding examples that used 2-digit date codes in business or accounting programs, because I have seen none. Instead of saying they did exist, could you show actual code snips (not in Basic, but COBOL or C)? I don't think you can. The early code included 4-digit years as well as time. My first accounting program that handled aging used Julian dates, and that was before 1967.
Ashton Tate Dbase circa 1960
dbase, widely used in accounting programs, was a staple for consultants in the 60's.
date4format command
Usage: void date4format( const char *date, char *result, char *picture )
Description: date is formatted according to the date picture parameter picture and copied into parameter result. date4format is guaranteed to be null terminated. The special formatting characters are 'C' - Century, 'Y' - Year, 'M' - Month, and 'D' - Day (e.g. "19800130"). If there are more than two 'M' characters, a character representation of the month is returned.
C Y M D
19 98 12 30
C + Y = 4 digits (no Y2K)
date4cdow command
const char *date4cdow( const char *date )
Description: A pointer to the day of the week corresponding to date is returned.
Parameters:
date This is a pointer to a character array in the form "CCYYMMDD". (no Y2K)
there were 19 date commands in dbase, all were 4 digit years (no Y2K)
richcares: with all due respect to your memory, enterprise software of the 1960s and 1970s wasn't written in dBase, among other things, because Ashton-Tate wasn't even founded until 1980.
One more time: what *anyone* did right with Julian dates before 1967 is irrelevant to deciding whether or not lots of people didn't, because enough people didn't do it right that a lot of others had to clean it up later.
I don't have any COBOL code, I didn't save any FORTRAN or PL/I from those days, and I keep saying, by the time UNIX and C came along, people more commonly did it better, but that didn't dematerialize the existing legacy code. [Get out your copy of K&R and see if I'm mentioned in the Preface to First Edition.] TIRKS was written in the late 1960s /early 1970s, in S/360 BAL & PL/I (& now has some Java & other modern stuff, but as of 2 years ago, still had a lot of BAL), so code lasts longer than people think.
enough. killfile.
"I did not know a single programmer that used 2 digit years ("98") in their programs other that the PC clock chip. Which is why there was no effect per the Y2K problem. ask the Chinese!"
Well, back in the 80s when mainframe disk space cost a zillion a megabyte (and PC hard drives were nonexistent), shaving the expiration date on a database of a hundred million customers by two bytes each saved the company enough money to get you a real nice bonus.
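A back-of-the-envelope version of that trade-off; the record layouts are hypothetical and the hundred-million figure is simply the one in the comment above.

    /* Two illustrative fixed-width date fields: dropping the century saves
       two bytes per record. */
    #include <stdio.h>

    struct expiry_2digit { char yy[2];   char mm[2]; char dd[2]; };  /* 6 bytes */
    struct expiry_4digit { char yyyy[4]; char mm[2]; char dd[2]; };  /* 8 bytes */

    int main(void)
    {
        long records = 100000000L;   /* a hundred million customers */
        long saved = records * (long)(sizeof(struct expiry_4digit)
                                      - sizeof(struct expiry_2digit));
        printf("bytes saved: %ld (~%ld MB)\n", saved, saved / (1024L * 1024L));
        return 0;
    }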
richCares, dbase gave you the option of 2 or 4 digit years using the SET CENTURY ON command. When I worked with dBase we used 4 digit years, but someone in a different section used 2 digit. That same section allowed free text fields for entering crucial information, so they had many more problems in their databases than just the year, but that doesn't change the point.
Access95 also had a 2-digit to 4-digit conversion rule that could create problems. There was no reason that people using access95 in small specially made databases were going to be thinking about this problem, especially given the dearth of skilled database developers in the 90s. Back then a lot of organisations with very little money for development had to develop their own systems, and what now seems surprisingly amateurish would have been considered back then to be quite sophisticated and forward-thinking.
Maybe in principle it shouldn't have been a problem, but in reality it was (see the sketch after the FAQ link below).
For example, the dBase Y2K faq makes the problems pretty clear:
http://64.132.211.166/Docs/Y2KFAQ.htm
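To make the conversion problem concrete, here is a sketch of the kind of 2-digit-to-4-digit "windowing" rule being described; the pivot values are invented for illustration, not taken from dBase or Access documentation.

    /* A year entered as two digits has to be expanded somehow; different
       pivot ("cutoff") settings silently disagree about the same input. */
    #include <stdio.h>

    int expand_year(int yy, int pivot)
    {
        return (yy <= pivot) ? 2000 + yy : 1900 + yy;
    }

    int main(void)
    {
        printf("yy=35, pivot 29 -> %d\n", expand_year(35, 29));  /* 1935 */
        printf("yy=35, pivot 49 -> %d\n", expand_year(35, 49));  /* 2035 */
        return 0;
    }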
This didn't exist in the 1960s either, richcares :)
So your claim to 1960s four-digit fame includes references to dbase, which didn't exist then, documented in ANSI C, which didn't exist then.
You were far ahead of your time, I must admit that!
dhogaza writes:
[[So your claim to 1960s four-digit fame includes references to dbase, which didn't exist then, documented in ANSI C, which didn't exist then.]]
dhogaza is making a common mistake here. The PC program called dBase didn't exist before the 1980s. But there was an earlier application for IBM mainframes and minis which was also called dbase (small 'b'). Back when I was looking for programming jobs in the '80s, it was very common to see ads for people who knew "MVS, CICS, IBM JCL, dbase."
Vulcan, the precursor of dBase, was used on the early C/PM machines in the 60's. dbase (small b) was based on vulcan and often called "dbase" C/PM was in wide use at the time, data was stored on 8" single sided discs.
Early programmers knew how to properly handle dates; if you can remember the large number in the corner of accountants' daily desk calendars, those were Julian dates.
The only way to handle date manipulations for aging or keeping track of inventory was the use of Julian dates. Those that did use 2-digit dates were usually not able to generate aging statements.
The dbase examples I gave were from an early Vulcan book dated 1964, which was later taken over by Ashton Tate (in 1980). My memory lapsed a bit when I called it "Ashton Tate". (I was born in 1936.)
During the Y2K crisis, I ran a chain of 10 branches owned by a Canadian company; they said they would send a Y2K expert at $400.00 per branch to correct my problems.
I told them to buzz off. My error rate was less than at the other branches that did the fix. Only one location (not mine) had a lot of errors; they were using a C/PM spreadsheet (precursor to Excel?).
So any problems caused by Y2K were on poorly designed systems (luckily there were not a lot of them).
Vulcan, the precursor of dBase, was used on the early C/PM machines in the 60's. dbase (small b) was based on vulcan and often called "dbase" C/PM was in wide use at the time, data was stored on 8" single sided discs.
Vulcan, the precursor of dBase, was written by C. Wayne Ratliff at JPL in 1978, for the IMSAI 8080, an S-100 micro-computer which had an Intel 8080 CPU and was first released in 1975. It was written in 8080 assembly language; the 8080 was released in 1974. Ratliff ported Vulcan to CP/M, the OS by Gary Kildall (RIP) first released by Digital Research in 1976. Vulcan was renamed to dBase after Ratliff did a deal with Ashton-Tate in 1980.
CP/M required a diskette drive; early CP/M machines used 8" diskettes as you recall. The first diskettes were 8" read-only units developed by IBM in the late 60s. The first read/write diskette was the hard-sectored 8" Memorex 650 in 1972. Remember all those holes?
There was an important IBM OS in the late 1960s called CP/CMS, which Kildall used and may have had something to do with the name of CP/M.
Regarding Y2K being a non-problem, one can still sometimes find websites which display the current year (in an obscure corner) as 19108. I personally found and fixed a variety of Y2K bugs at various times in the 1990s, some of which would certainly have caused serious problems for the systems of which they formed a part.
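The "19108" display is the classic struct tm bug in miniature: tm_year holds years since 1900, so pasting a literal "19" in front of it produced "19100" in 2000 and "19108" in 2008.

    /* Minimal reproduction of the "19108" year display bug. */
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t now = time(NULL);
        struct tm *t = localtime(&now);

        printf("buggy:   19%d\n", t->tm_year);   /* e.g. 19108 in 2008 */
        printf("correct: %d\n", 1900 + t->tm_year);
        return 0;
    }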
As for mea culpa, I personally wrote a great deal of business data-processing code in the early 1980s which stored dates with two-digit years, to save storage (the DG minis had removable 5MB and 10MB disk packs, in refrigerator-sized drives). Some of this code was in COBOL, some was in other business data processing languages (for instance, in a Data General 4G language whose name I forget). Like much other code written in the 60s, 70s, and 80s, the design lifetime of this code did not extend into the 21st century. Unlike much other code written in the 60s, 70s, and 80s, it is unlikely - for commercial reasons - that any of that code of mine was still running by the late 90s.
This would be the "Ashton Tate Dbase" that richcares references above? And C existed on these same IBM mainframes at the same time, complete with C++/ANSI C function prototypes?
It's not our fault if richcares cites a product made by a company founded in 1980.
If he didn't mean "Ashton Tate" he should not have said "Ashton Tate", and those of us who are reading what he says in good faith shouldn't be dinged for doing so.
Date is wrong, and C didn't exist then.
The Y2K consultant came to correct all my errors. I asked what the problem was. "When 2000 comes, all your 1900 records will be confused with your 2000 records." I don't have any 1900 records on my computer; my company started in 1985. "Well then, your 1985 records will be messed up in 2085." Sorry, but all records older than 5 years are archived then purged from my computer. "Yah, but there are process machines that will hang up." I don't have any process machines. "Bank teller machines will hang up." I don't have a bank teller machine; besides, I called my banker, and he said they archive then purge all records older than 10 years.
I paid his assessment fee (with regrets) and kicked him out. What a scam! Are all programmers that dishonest?
(blatant advertisement, but relevant):
For computing folks: if you haven't visited the Computer History Museum, www.computerhistory.org, in Mountain View, CA, and you ever get the chance, do so.
We even have punched cards, paper tape, and slide rules. We have a working IBM 1620 and DEC PDP-1 [Spacewar!], which we turn on now and then. We have Crays, and Gene Amdahl's first computer, replete with bullet holes, and lots more. So, if one likes *real* computer history, here it is.
I once took a gang of Stanford first-term freshmen in the accelerated CS program through, i.e. very smart 18-year-olds who'd been hacking C++ for years, assembled their own PCs, etc. However, upon seeing a keypunch & punch cards:
Q: what's that?
A: a keypunch.
Q: what was it for?
A: punching these cards, it's how we wrote code in the old days.
Q: NO WAY DUDE! nobody would write code like that! That's crazy!
A: Wait till you see paper tape or plug-boards...
I've still got an Osborne, and a closet full of software and stuff. Worked last time I fired it up. A while ago.
richCares, can you square your Julian dates argument with the dBase Y2K FAQ I posted? It doesn't matter whether your system is using Julian dates or not - what matters is that the data was converted into the wrong Julian date during data entry, and during data look-up the same problem exists. In a large dataset with dBase optimized for high-speed data entry (i.e. 2-digit years) this meant big problems. All the look-up screens may also have needed to be changed, or at the very least the config file would need to be changed. Hardly a big issue, but worth checking. My section was lucky: our dBase dates were stored in 8-character strings for use as indices, so no problem. But until 2000 they would have worked just as well as 6-character strings, and saved a tiny bit of space...
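A tiny C illustration of the point about date strings used as keys: 8-character "CCYYMMDD" strings keep collating correctly across 2000, while 6-character "YYMMDD" strings do not.

    /* Comparing 31 Dec 1999 with 1 Jan 2000 as plain strings. */
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        printf("8-char: %d (negative = still in order)\n",
               strcmp("19991231", "20000101"));
        printf("6-char: %d (positive = order now wrong)\n",
               strcmp("991231", "000101"));
        return 0;
    }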
Anyone who has tried handling dates in Access in Australia knows as well that an OS-dependent dating system can throw up huge problems. One database I worked with had reams of code to handle this, and we had to introduce a special logon script to check that the date was right, just in case some silly user changed it the previous day.
It's all well and good if you have a well-designed database, in a functional operating system, with good planning and good hardware. Most organisations which chose to do complex things on computers in the 80s and early 90s didn't have any of those things, and I for one am glad that my organisation took it seriously - a lot of people would have been very angry if it hadn't.
looking backwards at 1980 in Autoweek magazine:
Gasoline rises over $1 US per gallon.
GM announces plans to push electric car project to have it on market by 1984, and to have 31 mpg fleet average by 1985.
Congress authorizes $25 billion for synthetic fuel research, planning to curb oil imports by 1987.
Department of Energy predicts 19 million electric and hybrid cars on road by 2000.
Newspaper headline predicts "gasoline surplus to grow".
Datsun actually offers a 51 mpg car for sale, the B210MPG.
Also in 1980: Ronald Reagan elected president.