Database management systems
Historical notes on the database management system (DBMS) business.
Recently I expressed doubts about Actian’s DBMS-conglomerate growth strategy. For context, perhaps I should review other DBMS vendors’ acquisition strategies in the past. Some — quite a few — worked out well; others — including many too minor to list — did not.
In the pre-relational days, it was common practice to buy products that hadn’t succeeded yet and grow with them. Often these were programs written at user enterprises, rather than third-party packages. Most of Cullinet’s product line, including its flagship DBMS IDMS, came into the company that way. ADR, if memory serves, acquired the tiny vendor that created DATACOM/DB.
Then things slowed down. A Canadian insurance company oddly bought Computer Corporation of America, to utter non-success. (At least I got an investment banking finder’s fee on the deal.) Computer Associates, which did brilliantly in acquiring computer operations software, had a much rockier time with DBMS. It acquired Cullinet, Applied Data Research, and ASK/Ingres — among others — and didn’t have much growth or other joy with any of them.
Indeed, Ingres has been acquired three times, and hasn’t accomplished much for any of the acquirers (ASK, Computer Associates, Actian).
I used to think that Oracle’s acquisition of RDB provided key pieces of what became Oracle’s own extensibility technology. Andy Mendelsohn, however, disputed this vehemently — at least by his standards of vehemence — and his sources are better than mine. Rather, I now believe as I wrote in 2011:
… while Oracle’s track record with standalone DBMS acquisitions is admirable (DEC RDB, MySQL, etc.), Oracle’s track record of integrating DBMS acquisitions into the Oracle product itself is not so good. (Express? Essbase? The text product line? None of that has gone particularly well.)
Experiences were similar for some other relational DBMS pioneers.
I blogged a little last year about the rewards and challenges of combining professional services and software in a mature company’s business model. My main example was Oracle. But other examples from Oracle’s history might have been equally instructive. For example:
- Oracle started out doing what amounted to custom development for government (military/intelligence) clients.
- Even when Oracle said it had productized its software, the stuff didn’t work very well without services to get it running.
- Oracle and Ingres both got a huge fraction of their early revenue* from deals to port their software to various brands of hardware.** That’s a lot like professional services.
- Oracle’s huge Tools Group grew out of professional services, if I have the story straight. Indeed, its first product was written by later long-time group chief Sohaib Abbasi when he was a consultant.
Roland Bouman reminded us on Twitter of an old post I did on another blog about Ingres history, the guts of which was:
Ingres and Oracle were developed around the same time, in rapidly growing startup companies. Ingres was generally the better-featured product, moving a little earlier than Oracle into application development tools, distributed databases, etc. But Oracle was ahead on the most important attributes, such as SQL compatibility — Oracle always used IBM’s suggested standard of SQL, while Ingres at first used the arguably superior QUEL from the INGRES research project. And Oracle eventually pulled ahead on the strength of superior/more aggressive sales and marketing.
Then in the 1990s, Ingres just missed the DBMS architecture boat. Oracle, Informix, Microsoft, and IBM all came out with completely new products, based respectively on Oracle + Rdb, Informix + a joint Ingres/Sequent research project, Sybase, and mainframe DB2. Ingres’s analogous effort basically foundered, in no small part because the company made the penny-wise, pound-foolish decision to walk away from a joint research project it had undertaken with innovative minicomputer vendor Sequent in the Portland, OR area.
Computer Associates bought Ingres in mid-1994, and immediately brought me in to do a detailed strategic evaluation. (Charles Wang telephoned the day the acquisition closed, in one of the more surprising phone calls I’ve ever gotten, but I digress … Anyhow, the relevant NDA agreements, legal and moral alike, have long since expired.) There was nothing terribly wrong with the product, but unfortunately there was nothing terribly right either. Aggressive investment — e.g., to get fully competitive in parallelism and object/relational functionality, the two biggest competitive differentiators in those days — would have been no guarantee of renewed market success.
Notwithstanding the economic question marks, CA surprised me with its enthusiasm for taking on these technical challenges. But another problem reared its head: almost all the core developers left the company. (Noncompete agreements were utterly unreasonable in those days, at least in the hot Northern California job market; but if you weren’t willing to sign one, you couldn’t keep your job post-merger.) And so, like almost all CA acquisitions outside the system management/security/data center areas, Ingres fell further and further behind the competition.
Some of the same information made it into my post here on Ingres history later the same year, but for some reason not all did.
Talking to Algebraix reminded me that David Childs is still alive and kicking. I only ever encountered Childs once, in the early/mid-1980s, when he was pushing his company Set Theoretic Information Systems. The main customer example for STIS was General Motors, for which he had achieved a remarkable amount of database compression. It was something like 4-5X, if I recall correctly, but for 1983 or whenever that was pretty darned good. The idea was to replace stored data values with partitioning according to shared values. E.g., you didn’t store whether cars were red, blue, or green; instead, you stored the records for all the red cars in one place, the blue cars in another, and so on. There was also some set-theoretic mumbo-jumbo, but I never figured out what it had to do with implementing anything.
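For readers who want the mechanism spelled out, here is a minimal sketch of that partitioning trick in Python. This is my own illustrative reconstruction, not STIS code; the function name, field names, and sample data are all hypothetical. The point is simply that grouping records under each distinct value lets you store the value once per group instead of once per record:

```python
from collections import defaultdict

def partition_by(records, field):
    """Group records by a field's value, dropping that field from each record.

    The shared value (e.g., a car's color) is then stored once per group,
    rather than repeated on every record -- the compression idea described
    above. Hypothetical sketch, not how STIS actually implemented it.
    """
    groups = defaultdict(list)
    for rec in records:
        rest = {k: v for k, v in rec.items() if k != field}
        groups[rec[field]].append(rest)
    return dict(groups)

cars = [
    {"vin": "A1", "color": "red"},
    {"vin": "B2", "color": "blue"},
    {"vin": "C3", "color": "red"},
]

by_color = partition_by(cars, "color")
# "red" is now stored once for two cars, instead of once per record.
```

With low-cardinality fields like color, the savings compound: the more records share a value, the fewer times that value is physically stored.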
Comshare — a BI vendor before anybody called it BI — did actually build a DBMS based on Childs’ ideas, as Ron Jeffries reminds us. It was relational. Eventually, if I recall correctly, it was swapped out for Essbase (the original MOLAP product, now owned by Oracle).
What Childs really focuses on, however, seems to be “Extended Set Theory.” (This was brought to my attention by Algebraix, even though Algebraix doesn’t actually use many of Childs’ ideas.) And he’s been doing it for a long time. Way back in 1968, Childs wrote a paper outlining how set theory, relations, and tuples could be applied to data management.
And that’s where I did a double-take, because 1968 < 1970. Sure enough, Footnote #1 in Codd’s seminal paper is to Childs’ 1968 work. Indeed, Childs’ paper is the only predecessor Codd acknowledges as having significant portions of his idea.
I’m far from convinced that “Extended Set Theory” has much to offer versus the standard relational model. But that debate quite aside — Childs’ original achievement doesn’t get the credit it deserves.
The top PostgreSQL-related April Fool’s joke this year, which seems to have successfully pranked at least a few people, was that Postgres is dropping SQL in favor of an alternative language QUEL.
Folks, QUEL was the original language for Postgres. And Ingres. And, more or less, Teradata.* I’d guess Britton-Lee too, but I don’t recall for sure.
*Once upon a distant time, when I was a cocky young stock analyst, I explained to Phil Neches, chief scientist of Teradata, just why it was a really good business idea to drop T-QUEL for SQL. I doubt he was convinced quite on that day, more’s the pity.
On April Fool’s Day, it is traditional to spread false stories that you hope will sound true. Last year, however, I decided to do the opposite – I posted some true stories that, at least for a moment, sounded implausible or false. This year I’m going to try to turn the idea into a kind of blog-tagging meme.*
*A blog-tagging meme is, in essence, an internet chain letter without the noxious elements.
Without further ado, the Rules of the No-Fooling Meme are:
Rule 1: Post on your blog 1 or more surprisingly true things about you,* plus their explanations. I’m starting off with 10, but it’s OK to be a lot less wordy than I’m being. I suggest the following format:
- A noteworthy capsule sentence. (Example: “I was not of mortal woman born.”)
- A perfectly reasonable explanation. (Example: “I was untimely ripped from my mother’s womb. In modern parlance, she had a C-section.”)
*If you want to relax the “about you” part, that’s fine too.
Rule 2: Link back to this post. That explains what you’re doing.
Rule 3: Drop a link to your post into the comment thread. That will let people who check here know that you’ve contributed too.
Rule 4: Ping 1 or more other people encouraging them to join in the meme with posts of their own.
Hopefully, the end result of all this will be that we all know each other just a little bit better! And hopefully we’ll preserve some cool stories as well.
To kick it off, here are my entries. (Please pardon any implied boastfulness; a certain combustibility aside, I’ve lived a pretty fortunate life.)
I was physically evicted by hotel security from a DBMS vendor’s product announcement venue. It was the Plaza Hotel in NYC, at Cullinet’s IDMS/R announcement. Phil Cooper, then Cullinet’s marketing VP, blocked my entrance to the ballroom for the main event, and then called hotel security to have me removed from the premises.
A few years later, the same Phil Cooper stood me up for a breakfast meeting in his own house in Wellesley. When one’s around Phil Cooper, weird things just naturally happen.
In case you missed it, I’ve had a couple of recent conversations about the TPC-H benchmark. Some people suggest that, while almost untethered from real-world computing, TPC-Hs inspire real-world product improvements. Richard Gostanian even offered a specific example of same — Solaris-specific optimizations for the ParAccel Analytic Database.
That thrilling advance notwithstanding, I’m not aware of much practical significance to any TPC-H-related DBMS product development. But multiple people have reminded me this week that TPC-A and TPC-B played a much greater role in spurring product development in the 1990s. And I indeed advised clients in those days that they’d better get their TPC results up to snuff, because they’d be at a severe competitive disadvantage until they did.
It’s tough to be precise about examples, because few vendors will admit they developed important features just to boost their benchmark scores. But it wasn’t just TPCs — I recall marketing wars around specific features (row-level locking, nested subqueries) or trade-press benchmarks (PC World?) as much as around actual TPC benchmarks. Indeed, Oracle had an internal policy called WAR, which stood for Win All Reviews; trade-press benchmarks were just a subcase of that.
And then there’s Dave DeWitt’s take. Dave told me yesterday at SIGMOD that it’s unfortunate the Jim Gray-inspired debit/credit TPCs won out over the Wisconsin benchmarks, because that led the industry down the path of focusing on OLTP at the expense of decision support/data warehousing. Whether or not the causality is as strict as Dave was suggesting, it’s hard to dispute that mainstream DBMS met or exceeded almost all users’ OLTP performance needs early in this millennium. And it’s equally hard to dispute that those systems’* performance on analytic workloads, as of last year, still needed a great deal of improvement.
I meant to put up a longer post some months back, reproducing some of the 25th anniversary DB2 history IBM provided, courtesy of Jeff Jones and his team. Seems I didn’t get around to it. Maybe later.
The idea of specialized hardware for running database management systems has been around for a long time. For example, in the late 1970s, UK national champion computer hardware maker ICL offered a “Content-Addressable Data Store” (or something like that), based on Cullinane’s CODASYL database management system IDMS. EDIT: See corrections in the comment thread. (My PaineWebber colleague Steve Smith had actually sold – or at least attempted to sell – that product, and provided useful support when Cullinane complained to my management about my DBMS market conclusions.) But for all practical purposes, the first two significant “database machine” vendors were Britton-Lee and Teradata. And since Britton-Lee eventually sold out to Teradata (after a brief name change to ShareBase), Teradata is entitled to whatever historical glory accrues from having innovated the database management appliance category.
Wikipedia’s current article on Cullinet is long, detail-laden, and slanted. The difficulties are not of the sort to be fixed with my usual pinpoint Wikipedia edits. So I’ll just reproduce it here, commenting as I go. As for copyright — this particular post is as GPLed as it needs to be to comply with Wikipedia’s copyleft rules. All other rights remain reserved.
The company was originally started by John Cullinane and Larry English in 1968 as Cullinane Corporation. Their idea was to sell pre-packaged software to mainframe users, at that time a new concept; enterprises then used only internally developed applications or the software that came bundled with the hardware.