Thank you, Jonathan! Personally, I don't like the terms guru or expert; usually they're better at talking than doing. I once turned down a consultancy job in favour of a project management one because I preferred, and got satisfaction from, delivering systems rather than telling people how they should do it - and I know of one consultancy that forgot to tell their client they should do a critical path analysis for a complex programme; of course, it failed :-(
But, Y2K! I remember working on a system in the 1970s and remarking to a colleague that it wouldn't work after the year 2000 (but those 2 extra bytes we saved by leaving out the "19" were valuable on an 8" floppy diskette), and joking that we should set up a software house to do conversions in the late 1990s. Of course, we didn't, mainly because application software became big business over the decades rather than the "cottage industry" of the early 70s (and we'd moved on from development as well).
There were risks involved. The level of risk depended on the age of the software and how well maintained and supported it had been over the years. John Hawke's example of IBM probably reflects IBM's professionalism and level of investment, and what they charged customers. (Anyone remember the days when no one was ever fired for buying IBM? And if you didn't buy IBM, the IBM salesman would phone your boss and tell them what a bad decision you'd made!)
Most effort probably went into reviewing and testing software to ensure it wasn't vulnerable. I know where I worked at the time we had to modify some software to ensure it wouldn't fail, and my understanding was that elsewhere a number of old COBOL programs still in use by some banks did need significant modifications. If one was running a standard product such as SAP or Oracle ERP one would expect no Y2K issues, but if one had a bespoke system or a heavily modified application that was ageing, there was more likelihood it could be problematic. I know someone running a small business on old software who reset the base date of the system to a year in the 1980s whose day-month-year pattern matched that of 2000 onwards - the application only used two-digit year fields and would have failed after the year 2000, and the business couldn't support new software or the cost of modifications. (I think the source code wasn't available to make changes either.) Printed output had to be manually modified or retyped with the correct year :-) (See the sketch below.)
The systems I administered correctly supported years past 2000 internally, but a lot of screen displays, printouts, etc. only used two digits, and they had to be verified to ensure, for example, that CGT dates were showing from, say, 2001 not 1901 (the internally calculated value would be correct; only the printed/displayed dates would be wrong).
I also seem to recall the financial regulator wanted proof that financial institutions were taking it seriously, and, if I remember correctly, auditors - certainly of PLCs - ensured one had adequate Y2K plans in place; and the Big Four would have Y2K consultants who could help ... yes, John's correct: consultants would see an opportunity and some would be less scrupulous than others. It was the same with the GDPR - loads jumping on the bandwagon, many giving poor and erroneous advice (best choose advice from a lawyer who's won one or more cases against the ICO, or who takes a pragmatic approach suitable for one's organisation).
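An aside on that base-date workaround: the sketch below is my own illustration, not the original system, and it assumes the trick is to find an earlier year whose calendar lines up with 2000 - same leap-year status, and, if the application cares about weekdays, 1 January on the same day of the week. The matches_2000 helper is hypothetical.

# Hypothetical sketch - which earlier years share 2000's calendar layout?
import calendar
import datetime

def matches_2000(year, require_same_weekday=True):
    # Same leap-year status as 2000, and (optionally) 1 January on the same weekday.
    same_leap = calendar.isleap(year) == calendar.isleap(2000)
    same_weekday = (datetime.date(year, 1, 1).weekday()
                    == datetime.date(2000, 1, 1).weekday())
    return same_leap and (same_weekday or not require_same_weekday)

# Full match (leap year + weekday): 1972 is the classic candidate.
print([y for y in range(1950, 2000) if matches_2000(y)])
# Leap-year-only match in the 1980s, if the application ignored weekdays:
print([y for y in range(1980, 1990) if matches_2000(y, require_same_weekday=False)])

Under those assumptions, whether a 1980s base year "works" depends on whether the software cared about weekdays; presumably in the anecdote only the date arithmetic and leap years needed to line up.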
However, in my opinion Y2K wasn't a problem for the simple reason that businesses were aware of it - their staff had written systems that would fail on 1st Jan 2000 - and took appropriate action. And, yes, some people made lots of money out of it - consultants and ageing COBOL programmers who postponed retirement :-) But, then, what would have been the cost of banks not processing payments, air traffic control grounding flights, or thousands of PCs "blue screening"? Oh, wait, didn't TSB, NATS, CrowdStrike et al demonstrate that - but fortunately not all on the same day :-)
Michael Ixer ● 68d