2018/12/16

Why Merging Software Products is Harder Than Expected

Recently a friend of mine told the story of a small company that was acquired by a larger company with a competing product. The idea was "to take the best parts of both products" and make something better. Ten years later, that had not happened; in fact, technical issues plus tensions between the two product teams pretty much ensured it never would.

At one of my previous companies, something similar unfolded. The company had two tiers of products for the same market: one was high-end and very expensive, the other a lower-cost alternative. Both were very mature products, and as you might expect, functionality converged a bit over time as the high-end product extended into features suitable for smaller customers while the low-end product expanded its offering. Each product had its own strengths and weaknesses.

Management noticed this and said, "hey, these two products do a lot of the same things, why not combine them and get the best of both worlds?" They filled a chart board with post-its showing the commonalities between the products. Each post-it contained words like "generates reports" and "computes portfolio balances". Never mind that each post-it encapsulated big piles of bespoke code that had only ever talked to the other post-its within the same product.

This is a bit like some big manufacturer saying, "We build trains, and we build buses, and they both do similar things -- moving people and stuff around. The great thing about a bus is that it can go anywhere there is a street. Trains can only go where the tracks have been laid, but they have much greater capacity, better energy efficiency, and can be customized out of modular components. What if we take the best of a bus and the best of a train, and put them together? We change out the wheels and we have a tra-bus that can go anywhere! And the bus windows are much nicer; they're bigger and people can open them. Let's put those on."

And so it goes, until you have an incredibly ugly train-like thing built on a bus frame that isn't street legal, can't use existing modules (railcars), and now needs more engines and better brakes, and so on.

As incredibly bad as this idea is in (vehicular) hardware, it's even worse in software; at least with these tra-bus monstrosities, the problems of connecting the disparate parts are physical and easy to visualize.
In software, it's too easy to lose sight of how parts fit together, and the possibilities are unbounded (blocking APIs, async APIs, RESTful APIs, redirected I/O, mapped memory, named pipes, sockets, protocols built on sockets, databases, data files, etc.). Also, the parts can be written in entirely different languages that have trouble talking to each other. It doesn't even have to be different languages to be hard: combining C++ written for Windows with C++ written for any other operating system can be insanely difficult (see the sketch below).

The challenges of integrating software products are enormous, and easy to grossly underestimate if you just break the product down into a bunch of functional boxes scribbled on post-its and say "hey, most of these boxes do the same thing!" If you see management doing something like that, it's either time to speak up, or time to update your LinkedIn profile.
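To make that same-language point concrete, here is a hypothetical sketch (the function name and the two "products" are invented for illustration) of how even a trivial read-a-file routine diverges between a Windows codebase and a POSIX one -- different string types, different handle types, different error models -- before you ever get to the actual business logic:

    // Hypothetical: the "same" post-it ("reads a file") from two codebases
    // that share almost nothing mergeable.

    #ifdef _WIN32
    #include <windows.h>
    #include <string>

    // Product A (Windows): UTF-16 paths, HANDLEs, Win32 error codes.
    std::string ReadAll(const std::wstring& path) {
        HANDLE h = CreateFileW(path.c_str(), GENERIC_READ, FILE_SHARE_READ,
                               nullptr, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL,
                               nullptr);
        if (h == INVALID_HANDLE_VALUE) return {};  // sketch: error == empty
        std::string out;
        char buf[4096];
        DWORD n = 0;
        while (ReadFile(h, buf, sizeof buf, &n, nullptr) && n > 0)
            out.append(buf, n);
        CloseHandle(h);
        return out;
    }
    #else
    #include <fcntl.h>
    #include <unistd.h>
    #include <string>

    // Product B (POSIX): UTF-8 paths, file descriptors, errno.
    std::string ReadAll(const std::string& path) {
        int fd = open(path.c_str(), O_RDONLY);
        if (fd < 0) return {};  // sketch: error == empty
        std::string out;
        char buf[4096];
        ssize_t n;
        while ((n = read(fd, buf, sizeof buf)) > 0)
            out.append(buf, n);
        close(fd);
        return out;
    }
    #endif

Even here the two halves disagree on the type of the path, the meaning of failure, and the threading and locale assumptions lurking behind them. Now multiply that by every post-it on the board.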

Inverting the Emulation Stack as a Thought Exercise

The very first processor I worked on as a software professional was the Motorola 6800, an 8-bit processor that could address 64K of memory and ran at a clock rate of 1 MHz. The machines I worked on had less memory than that, and everything was done in assembly (of course).

Today I'm working on a Windows machine with a 64-bit, 4-core processor clocked at 2.8 GHz, and accessing... well, it's Moore's Law in action.

Would it be possible to write an AMD64 emulator on the 6800? I don't think it could be done within the processor's 64K address space. Given the complexity of a modern core, more code may be needed to emulate the instruction set than will fit in 64K. Restricting the emulator to a subset (off-boarding floating point, for example) might help it squeak by, and we would not need to implement internal features such as instruction reordering or prefetch.
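To see where all that code goes, consider the core of any instruction-set emulator: a fetch-decode-execute loop with one handler per opcode. A minimal sketch follows (in C++ for readability; on a real 6800 it would be hand-written assembly, and the names here are invented):

    #include <cstdint>

    // Hypothetical emulated-CPU state. On a real 6800, every 64-bit
    // register would have to be synthesized from 8-bit pieces.
    struct Cpu {
        uint64_t regs[16];  // x64 general-purpose registers
        uint64_t rip;       // instruction pointer
        uint64_t rflags;    // flags
    };

    static uint8_t fetch8(Cpu& cpu, const uint8_t* mem) {
        return mem[cpu.rip++];
    }

    // Execute one guest instruction. The switch is the problem: x64 has
    // hundreds of opcodes, plus prefixes and ModRM/SIB addressing forms,
    // and every case costs code bytes against the 64K budget.
    void step(Cpu& cpu, uint8_t* mem) {
        uint8_t opcode = fetch8(cpu, mem);
        switch (opcode) {
        case 0x90:                 // NOP
            break;
        case 0x48: {               // REX.W prefix: 64-bit-operand instruction follows
            uint8_t next = fetch8(cpu, mem);
            // ... decode ModRM, dispatch to a 64-bit handler, update flags ...
            (void)next;
            break;
        }
        default:
            // Unimplemented opcode: trap to a monitor.
            break;
        }
    }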

So... if it takes a couple hundred 6800 instructions to emulate one x64 instruction (on average), the 6800 runs at 2 MHz (the fastest variant of the part, not the 1 MHz machine I started on), and emulation doesn't require any I/O to a file system, then the emulated x64 would be executing about 5K x64 instructions a second, or "only" a 400K-to-1 slowdown.
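Spelled out, with the per-instruction costs being my rough assumptions rather than measurements:

    2,000,000 cycles/sec (2 MHz), at an optimistic ~2 cycles per 6800 instruction
        => ~1,000,000 6800 instructions/sec
    1,000,000 / ~200 6800 instructions per emulated x64 instruction
        => ~5,000 emulated x64 instructions/sec
    A single modern core retiring roughly 2 billion instructions/sec:
        2,000,000,000 / 5,000 = 400,000-to-1 slowdown

Real 6800 instructions take 2 to 12 cycles, so the true figure would be worse, but the order of magnitude stands.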

2018/10/15

Science vs Economics

An important aspect of good computer science is understanding the root causes of bad behavior in a system. Without a deep understanding, hacking around the problem often leads to bigger problems down the road: the original problem goes unaddressed, and the workaround can easily introduce unintended side effects or new errors.

The same rigor applies in hard sciences like biology, chemistry, and physics. Einstein's famous theories could be seen as an evaluation of the root causes of physical phenomena that didn't agree with classical physics.

This doesn't happen in economics. Case in point: economists date the start of the Great Recession to 2008, when the acceleration in mortgage failures could no longer be ignored. However, the root causes of those failures can be traced much further back: the repeal of Glass-Steagall in the 1990s, Greenspan's lackadaisical and obfuscatory management of the Fed, the lack of regulation of the mortgage industry that followed, corruption across the board in the finance industry, including regulators, the de-fanging of the SEC, and so on.

The failure to dig into root causes is one of the things that constrains the utility of economics (pun intended). Another is its dependence on the notion of people as rational actors, though at least that problem is being addressed to some degree by the field of behavioral economics. But perhaps this is optimistic.

Illustrating this blindness to root causes is an interview with Neel Kashkari, in which he says he bought a house in California in 2005 and "didn't see it coming." I found this stunning. I was shopping for a home in California in 2005, and everything was screaming "this is insanity!" Ads on the radio for insane loan offers, home prices outstripping every other economic indicator, a dentist I met who was doing interest-only loans and laddering up into ever-more expensive zip codes, and who thought he was a genius. It. Was. Nuts. The writing was on the wall, in wall-tall letters.

The best part, the truly best part, is that guys like Kashkari take credit for turning things around. But did they really do anything to improve the odds against future financial disasters? Like, some perp walks for the greedheads who almost destroyed the world's economy? Or a serious restructuring of the financial industry to counter the centralization of wealth and power? Ironic, scornful laughter ensues.