This page lists some of the damage I caused.
Like many programmers, I started with video games. I'm still coding little games from time to time.
Alpha-Waves was the first 3D platform game ever. I started working on that game in 1988, and it was published by Infogrames in 1990. It was a puzzle/labyrinth game where you jumped from platform to platform to reach doors that would give you access to the next room. It was also in my opinion the first truly immersive 3D game, in the sense that it was the first one to combine true 6-axis 3D, some kind of physics engine, and interaction with a large number of objects.
As an illustration of the capabilities of HPDS, I wrote a few games. The first one was a Pac Man clone, released initially as an assembly-only program with HPDS 1.0, and later converted to an HP-48 library with HPDS 2.0 in 1991. According to Paul Courbis, this was the first game to use the hardware scroll of the HP-48 calculator, making the game very fluid. I don't recall exactly when the game was created, but I would say around 1990, shortly after the introduction of the HP48 (HPDS had been developed earlier, for the HP28).
Being one of the very first games for the HP-48, PacMan quickly became very popular, taking advantage of the HP48 built-in infrared networking to spread virally. Hopping from machine to machine, it spread to other countries, including the United States. I received letters from all over the world.
Various "patches" were made by other programmers, who sometimes added real value, like new levels or compatibility with the HP48GX, but sometimes did little more than put their names in the credits. Various other authors are still incorrectly credited for the game on HPCalc.org today. Despite the availability of the source code as part of HPDS, most of these patches were actually made by manipulating the binary, sometimes introducing bugs. Some letters I received complained about bugs that were introduced by these patches.
Another demo game for HPDS, also written in 1991, was a small Tetris clone. Like PacMan, this game was intended primarily as a kind of tutorial and demo for what could be done with a cross-development tool for the HP48. It was not as successful as PacMan, because there were already dozens of Tetris clones on the HP48 market. I am not positive that the picture here is of my Tetris; it might be another one.
As HPDS became more sophisticated, I decided to use it to write more complex HP48 games. Version 3.0 of HPDS included a clone of Lemmings, complete with an editor to create new levels. This particular source code is dated 1993. It was pushing the limits of the calculator, in particular in terms of memory, but also in the number of lemmings I could animate on screen with a meager 2MHz 4-bit CPU...
Like for PacMan, other programmers contributed "patches", adding their name to the credits. But in these wild times, very strange things happened.
In the years following the release of HPDS, I regularly received letters from people requesting the programs I had written. And then one day, I received a letter asking for my PacMan, my Tetris, my Lemmings and my Space Invaders. So I responded, sending back the first three programs, and explained I had not written a Space Invaders. Then a second letter asking for Space Invaders. Then a third one.
I thought little of it, until a friend, who had a large collection of HP48 programs, told me "Your Space Invaders is really not as good as your other games". I replied that I had never written a Space Invaders, so he showed it to me. And indeed, someone had written a Space Invaders and put my name in the credits! I'm still hoping that this was a kind of homage to my other games... I never knew who the real author was.
This Breakout is not a true game but a demo for Tao3D. The idea was to create a real game with very little code, in order to show kids and students the basics of programming. Tao3D makes it easy to create really short code for animations.
This serious game, developed for Thales using Tao3D, is probably the largest application written with the language to date. It offers an interactive presentation of Thales Underwater Systems' sonar offering, followed by a simulation of sonar operations.
Firefly is a work-in-progress experiment to create a game rendered entirely using ray marching. Ray marching is a rendering technique that can compute realistic shadows and lighting on scenes containing a very large number of visible elements. It can now be done in real time, if barely, on a GPU. An interesting idea in Firefly is to compute collisions and physics using the same distance field as the rendering.
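As a rough illustration of the technique (not Firefly's actual code, which runs as GPU shaders), here is a minimal sphere-tracing loop in Python over a signed distance function:

```python
import math

def sphere_sdf(p, center, radius):
    """Signed distance from point p to the surface of a sphere."""
    return math.dist(p, center) - radius

def ray_march(origin, direction, sdf, max_steps=100, eps=1e-4, max_dist=100.0):
    """Advance along the ray by the distance to the nearest surface,
    until we hit something (distance ~ 0) or give up."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < eps:
            return t            # hit: distance along the ray
        t += d                  # sphere tracing: safe step size
        if t > max_dist:
            break
    return None                 # miss

# A unit sphere at the origin, seen from z = -5: first hit at t = 4
hit = ray_march((0.0, 0.0, -5.0), (0.0, 0.0, 1.0),
                lambda p: sphere_sdf(p, (0.0, 0.0, 0.0), 1.0))
```

The point of the Firefly experiment is that the same distance function can answer "how far is the nearest obstacle?" for collisions and physics as well as for rendering.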
For three decades, I have been doing some research in programming languages, to help programmers write better code faster. Here are my various experiments, in roughly chronological order.
My first published program was a BASIC extension for the Sinclair ZX Spectrum, which gave it a range of graphic features such as: windows, ellipses, clipping, flood fill (including with 8x8 patterns)... It was distributed by Valric-Laurene. It was a 3.5KB program written in Z80 assembly language.
I was unlucky enough to call the program Spectrum+, just one month before Sinclair launched a computer by the same name... In any case, I'm afraid this program was forever lost in the mists of time, like the second program I wrote for Valric-Laurene, a BASIC patch for the French Sinclair QL.
SATAS was an assembler for the HP-28C, HP-28S and HP-48SX calculators. It later evolved into the "HP Development System" or HPDS, which allowed you to create any kind of program for the HP calculators, not just assembly-language programs. It was used to create a number of games for these calculators. I lost the source for HPDS, although I believe it was published on-line at some point, but I can't remember where. Curiously, I still have the source for the games along with a now useless MacOS7 binary for HPDS 3.0.
A training period at Alsys SA (Ada compilers) in 1988 led me to reflect on what I liked in Ada and what I did not. I began laying the foundations of LX, short for langage expérimental (experimental language in French). I worked on LX between 1988 and 1998 without publishing anything or keeping a good track record. I rewrote the compiler from scratch several times as new ideas came up. Earlier LX compilers generated MC68000 assembly language. I later switched to generating C for portability reasons, using code generation tables for configuration.
I ultimately published LX as free software in early 2000. By that time, LX already had a user-accessible representation for the parse tree and basic garbage collection. Despite never becoming feature complete, it also had some pretty advanced capabilities, like generics (similar to C++ templates).
In 1998, I joined the Hewlett-Packard Compiler and Languages Lab (CLL) in Cupertino, California, to work on the HP aC++ compiler. I was also supposed to work on my own language, but that quickly tanked, prompting me to go open-source.
The aC++ code base came from Taligent, and was both advanced in a few aspects, and really backwards in others (one function was 1700 lines of code). On the compiler, I initially worked on templates, performance, the run-time library and precompiled headers.
Later, I implemented exception handling and I was one of the initiators of the C++ ABI effort. I also represented HP at the C++ standard committee for a short period. Finally, I built the first native Itanium C and C++ compilers (the HP plan at the time was to run PA-RISC cross compilers in a PA-RISC emulator...). This required coordinating the work of 16 different teams with 16 different ways of writing makefiles...
During discussions with other engineers at Hewlett-Packard, we kept talking about a "program database", a way to store all information coming from the front-ends for different languages. The idea was to offer thin tools, small language extensions that would work for all languages, irrespective of their syntax. Use cases included: renaming a function and all calls to it; implementing symbolic differentiation; performing high-level optimizations; etc.
I started implementing this, initially under the name "Xroma", coined by Daveed Vandevoorde, and then renamed it Mozart after Daveed left HP and told me he wanted to keep the name Xroma for his own projects. Mozart saw the appearance of the first variant of the "lightbulb" logo.
Following a number of very interesting discussions notably with Daveed Vandevoorde, the focus of LX changed from "experimental" to "extensible". The LX language was renamed XL, and re-implemented on top of Mozart (which, in turn, was largely reusing code from LX).
The XL language inherited advanced generics from LX, but at this point, it was still a language with many keywords and a standard syntax-directed parser.
Besides XL, Moka was the first (and only) language implemented on top of Mozart. It was a relatively complete Java parser. Moka made it possible to implement "thin tools" manipulating Java code, for example to automatically generate boilerplate code or to perform symbolic differentiation.
Moka quickly demonstrated the limits of the approach in Mozart. The parse tree structure was modelled after the syntax of languages such as Java or C++, so it had dozens of high-level node types, e.g. "IfThenElse" or "FunctionDeclaration". Subtle differences between languages made it very complicated to write a useful plug-in that would actually work with multiple languages.
XL2 was a complete re-implementation of XL using a much simpler parse tree structure, consisting of only 8 node types, which makes meta-programming extremely easy. That structure is still used to this day in XLR, Tao3D and ELFE.
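A minimal sketch of such an 8-node parse tree, in Python (the node names here are my approximation of the actual XL source, not a copy of it):

```python
from dataclasses import dataclass

@dataclass
class Tree: pass

# Four leaf node types
@dataclass
class Integer(Tree): value: int    # 42
@dataclass
class Real(Tree): value: float     # 3.14
@dataclass
class Text(Tree): value: str       # "hello"
@dataclass
class Name(Tree): value: str       # x, foo, +

# Four inner node types
@dataclass
class Prefix(Tree):                # -x, sin x
    left: Tree
    right: Tree
@dataclass
class Postfix(Tree):               # 3!
    left: Tree
    right: Tree
@dataclass
class Infix(Tree):                 # x + y, A -> B
    name: str
    left: Tree
    right: Tree
@dataclass
class Block(Tree):                 # (x), [x], {x}
    child: Tree

# "x + 1" parses as:
expr = Infix("+", Name("x"), Integer(1))
```

With only these eight shapes to pattern-match against, a meta-programming plug-in stays short, instead of having to handle the dozens of high-level node types that Mozart had.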
XL2 was also the only implementation of any of my languages where I tried to write the XL2 compiler in XL2, using a three-step bootstrap. A first stage was written in C++, and was used to translate a "bootstrap" compiler, which itself was used to build the "native" compiler. That final compiler had advanced features like true generic types. It also featured compiler plug-ins to extend the language, for example to implement symbolic differentiation.
XLR was initially designed as a back-end for XL2 that would enable dynamic compilation using LLVM. I tried to make the simplest language using the XL parse tree I could think of, largely taking inspiration from Pure. The language quickly took on a life of its own. I found it so elegant and simple that I soon stopped working on XL2.
Tao3D is a programming language for interactive 3D, derived from XLR. It makes it easy to create great visualisations and animations very quickly. It is powerful enough to present complex information in an entertaining and interactive way. Tao3D can be used for talks, trade shows, museums, education, and more. A commercial version distributed by Taodyne, a company I co-founded, adds support for stereoscopy, auto-stereoscopy, encrypted documents, and a few other features.
ELFE is a very simple and small programming language for everyday programming. While it is a general purpose language, it is specifically tuned to facilitate the configuration and control of swarms of small devices such as sensors or actuators. It can also be used as a powerful, remotely-accessible extension language for larger applications. ELFE is a trimmed-down version of XL that can run without LLVM. It is also the first language I released to incorporate a type system I designed in late 2012, based on tree shapes. In many ways, ELFE is the best implementation of XL to date, and is likely to take over.
ELFE used to be called ELIoT (Extensible Language for the Internet of Things), but Legrand complained that they owned the trademark, so it was renamed.
HP ECUTEST was a real-time test system for car electronics, in particular engine control units (ECU). It was capable of generating or monitoring several hundred signals (about 400 in a typical configuration), using a wide variety of analog or digital devices. It featured sub-millisecond closed loop (250us in some configurations) and continuous data logging with multiple time bases, e.g. some based on time, some based on engine rotation.
The ECUTEST operating system was doing heterogeneous multiprocessing, combining an x86 Windows PC for command, control and data visualization with several real-time boards (68040 running pSOS or PA-RISC with HP-RT). Later systems added a number of VME or VXI devices with their own CPUs (68040 or 68332). The device driver model was distributed, in the sense that configuration was made by the controlling PC, then real-time measurements were done on the 68040/PA-RISC, and then the PC did the actual disk I/O for data logging. ECUTEST also included a hardware-agnostic binary translator, letting customers specify equations that were computed every cycle.
Exception handling is a mechanism used in languages such as C++ to deal with run-time errors. Traditional implementations were very hostile to optimizations, which was a real problem for a processor such as Itanium where compiler optimizations were essential for performance.
Brighter minds at Hewlett-Packard had designed an approach which still allowed optimizations. I was tasked with implementing this design in the C++ front-end and in the support library. I then documented the way it worked in an article. This approach remains the way exception handling is done in most compilers to date.
Following the work on C++ exception handling, I asked Cary Coutant, a brilliant HP engineer who was responsible for the C ABI, if there was a way to make a common ABI for C++ on Itanium. He quickly organized the meetings, and Daveed Vandevoorde and I began meeting with C++ engineers from many companies.
The result was a common C++ ABI (application binary interface) describing how C++ programs must represent data and code on Itanium computers. Since this was, in practice, the first time there was a standard-compliant C++ ABI, the work was soon adopted by compiler vendors for other platforms, starting, I believe, with GCC, under the impulse of Mark Mitchell, who also quickly became the primary maintainer of the ABI documentation.
So while it is still called the Itanium C++ ABI, in practice it has been adopted on many platforms, and is still maintained to this day for new revisions of the C++ standard.
The HP DE200C was Hewlett-Packard's attempt at entering the market for digital entertainment appliances. It looked and behaved like a CD player, but inside, it was a PC running Linux. It could play CDs, MP3 files, and selected Internet radios through an HP service. It flopped, largely due to the very high price tag. So HP dumped the remaining units for $150 apiece, which was a very good price for a very silent and nice looking, albeit underpowered PC.
I turned mine into a complete home-entertainment solution based on MythTV. I replaced the built-in CD drive with a DVD drive, added more memory and a TV tuner card, installed Gentoo Linux and MythTV, and started tinkering. The largest piece of software I wrote was a client/server driver for the front-panel vacuum fluorescent display (VFD). This design allowed multiple small Linux apps to display in turn on the VFD, including using LCDproc. I also created a few patches for the Linux kernel to deal with the remote control and front-panel keys (which appeared as some kind of non-standard keyboard).
HP Integrity Virtual Machines (HPVM) is a virtualization solution for Itanium-based servers. I started this project in late 2000 and left the team in early 2010, making it the project on which I spent the most time in my professional life. It was also the most complex one, being an enterprise-grade virtualization solution for Itanium processors (not the simplest CPU architecture to virtualize).
For this project, I was fortunate to get the early help of two HP system software gurus, Todd Kjos and Jonathan Ross. The three of us got the software to the point of booting 4-way HP-UX guests. Then, we got a much larger team, mostly composed of former Tru64 and OpenVMS wizards who helped us turn this prototype into a real product. The team was made up of remarkably bright people, some of whom had been working on operating systems such as TOPS-10, TOPS-20 or OpenVMS long before I even knew what a calculator was.
In addition to the early design of HPVM, I contributed to many facets of the product: low-level interrupt handlers; binary translation; memory management; interrupt emulation; context switching; networking and disk I/O; debugging tools; scheduling; scalability; testing; user-space management tools; and I am probably forgetting a few. That was a lot of fun, many interesting problems, and interaction with a number of super-smart people, like Karen Noel, who, besides doing most of the work for on-line guest migration, also tried (and failed) to improve my OpenVMS and English skills.
HPVM uses HP-UX as its "management console", but also inherited its remarkable I/O capabilities. I called the context switch between the HPVM monitor and HP-UX "transmogrification" (abbreviated Xmog), a reference to Calvin and Hobbes, a hat tip from a French comics lover to Bill Watterson.
HPVM offered very good scalability thanks in large part to scheduling work done by Scott Rhine. It also featured high-level management tools, para-virtualized I/Os, on-line guest migration, on-line addition and removal of devices, RAM and even CPUs, cluster integration, and many more things that are just starting to show up in open-source virtualization tools.
I feel really bad for not naming all the individuals who contributed to HPVM, but they know who they are, and they know I remember them all fondly.
I also wrote a few regular applications that were neither system software, nor programming languages, nor games.
µSolver was the first graphical application I wrote for the Atari ST, probably in 1986-1987. It was essentially a clone of TK!Solver, a tool to solve multiple non-linear equations. I had created it mostly as a way to validate that I had figured out how TK!Solver worked. Compared to TK!Solver, µSolver added complex numbers and more advanced graphing capabilities. And of course, it ran on the less expensive Atari ST instead of the Mac.
µSolver was written in GFA Basic, a structured variant of BASIC. It worked well in interpreted mode, but the GFA BASIC compiler was buggy, and I could never produce a stable compiled version. That obviously prevented me from ever turning it into a commercial product.
One day, I showed µSolver to an older student. I was very proud of having invented a way to solve multiple non-linear equations. Then the student said something like "Oh, I see, you used the Newton-Raphson algorithm". I was really bummed that my algorithm, which I thought was really cool, had actually been described centuries earlier by Newton.
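For reference, the method the student recognized fits in a few lines. Here is a minimal Python sketch (not my original GFA Basic code) for a single equation; µSolver applied the same idea to systems of equations:

```python
def newton_raphson(f, df, x0, tol=1e-10, max_iter=50):
    """Find a root of f by iterating x := x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:       # close enough to a root
            return x
        x -= fx / df(x)         # follow the tangent line to zero
    return x

# Example: solve x^2 - 2 = 0, i.e. compute sqrt(2)
root = newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
```

The appeal, then as now, is that each iteration roughly doubles the number of correct digits when the initial guess is reasonable.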
I still have a floppy which theoretically has µSolver on it, but no drive or machine to read it, and I don't know the chances of reading a disk that is 25 years old. So I guess it's been lost.
CapaVoit (Capacité de Voitures) was my first Mac application. I created it while working for Axlog Ingénierie in Paris. It was a tool to help design cars for subway trains, with a particular focus on capacity planning and rapid prototyping.
In practice, CapaVoit was basically a MacDraw clone with a built-in numerical solver similar to that in µSolver, enhanced to deal with inequations. The first release was not very successful with the customer, because very often, the solver would override what had been drawn with a perfectly acceptable solution containing mostly zeroes. Release 1.5 fixed this issue and addressed much other customer feedback.
I am an avid Emacs user. So when Apple released OSX with a text-only version of Emacs (one that worked only in the Terminal app), I was disappointed. I waited a couple of months, thinking that someone else would fix this. But it did not happen. So I took an old NeXT port, adapted it to the Aqua graphical interface, added a few OSX-only niceties like transparent windows, and published it on SourceForge.
This port is now obsolete. By the time Emacs 23 was out, there were several forks and alternative implementations, and soon, maintaining a separate branch of Emacs became unnecessary, since the primary version had integrated native graphical support for Mac OS X.
Tao3D Studio (formerly known as Tao Presentations) is an application to create 3D presentations using the Tao3D language. Tao3D Player is an application playing back Tao3D documents. They are developed and sold by Taodyne, a company I co-founded in 2011.
Both applications share most of their source code and features, and build on the open source Tao3D language. The commercial versions add support for stereoscopic and auto-stereoscopic displays (3D with and without glasses), as well as the ability to sign or encrypt documents.
As an amateur physicist, I've long been very interested in quantum mechanics and relativity.
The standard way to teach special relativity still uses the Lorentz and Einstein formalism as written in 1905. For example, relativity teaches us that if two spaceships cross at high speed, each will see clocks in the other spaceship beating slower than its own clocks. In high school, I found that shocking, since if A was smaller than B, then B could not also be smaller than A. But the best answer my physics teacher could give me was: "That's what the computation says". That answer did not satisfy me at all.
So I kept working on special relativity until I finally got it. And I finally understood that special relativity is just the regular laws of perspective, but in a space-time where you need to put a minus sign in the Pythagorean theorem. To me, that was like a revelation, and I needed to share it with the world.
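The whole idea fits in a couple of formulas. In ordinary space, rotations preserve the Pythagorean distance; in space-time, Lorentz boosts preserve the interval, which is the same formula with a minus sign, and a boost is just a rotation by a hyperbolic angle:

```latex
% Ordinary space: rotations preserve the Pythagorean distance
d^2 = x^2 + y^2 + z^2
% Space-time: boosts preserve the interval (note the minus signs)
s^2 = (ct)^2 - x^2 - y^2 - z^2
% A Lorentz boost is a hyperbolic rotation by the rapidity \varphi:
ct' = ct\cosh\varphi - x\sinh\varphi, \qquad
x'  = x\cosh\varphi - ct\sinh\varphi, \qquad
\tanh\varphi = v/c
```

Replace $\cosh$ and $\sinh$ with $\cos$ and $\sin$ and you get an ordinary rotation, which is why each observer "seeing the other's clocks run slow" is no more paradoxical than two people at an angle each seeing the other as foreshortened.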
It did not make much of a difference. Most people who really know relativity also know that the Lorentz transform is a rotation (related to the Wick rotation). And yet, they still use Einstein's formulas to teach, which totally obfuscates the physical meaning, leaving non-scientists dubious at best, and preventing students from using their intuition to verify their equations.
With special relativity, Einstein proposed that we give up the idea that there is a central reference coordinate system that is superior to the others (the aether) to describe physics. With general relativity, Einstein further proposed that we give up the idea that a "flat" (Euclidean) coordinate system is superior to the others to describe physics. With the Theory of Incomplete Measurements, I propose that we more generally give up the idea that there are physical measurements that are superior to others.
By focusing on what a measurement is, the TIM reconstructs both quantum mechanics and general relativity, which become two approximations in the theory. The TIM explains a few long-standing "mysteries" such as why the wave-function is complex-valued and represents probabilities of presence, or what its collapse actually means. The TIM includes a new kind of math to describe physics without making any measurement, not even of space or time. It demonstrates that space-time is discrete, that different measurements may give space-time different topologies (e.g. the old and new definition of the metre, based on metal rods or lasers, curve differently in the presence of matter), and that what we call "space-time" really describes properties of the interaction of photons with matter.
As far as I know, the TIM is not considered a serious theory of physics by anybody today, and has not generated nearly as much interest as I hoped. I remain confident, however, that it is the best approach we have today for unifying general relativity and quantum mechanics. Because compared to alternatives such as quantum gravity or string theory, the core ideas are all so simple and intuitive I believe they just can't be wrong.
I've dabbled with various forms of art. I'm not a real artist any more than I am a real physicist, but that won't prevent me from drawing little cartoons...
There once was a time when cell phone ring tones had to be monophonic. Two notes were one too many. So I tried to create a monophonic ring tone that would not be too boring. I still use it to this day. I named it Trop de Notes (Too many notes) in reference to this wonderful scene in Amadeus. I also thought of a passage in The Black Cloud where the Cloud asks for some music to be played back too fast.
I created a few other pieces of music which are also mostly crap. Oh well, I'm better at computers than music. But at least, my ringtone is my own ;-)
3DIVART is my attempt to adapt ray-tracing and ray-marching shaders from locations such as ShaderToy for use on auto-stereoscopic screens using Tao3D. As a result, I get 3D interactive visual art, i.e. something that you see in 3D, but that is perfectly synthetic.
Most of the 3DIVART pieces I created that way are obviously derived from work on ShaderToy. But I also started creating a few pieces of my own.
Initially, I had a single blog, much like everyone else. Over time, I figured it was easier for my readers to have multiple blogs with a specific topic for each.
I started the Grenouille Bouillie blog ("Boiled Frog" in French) in 2003, after returning from California. I wanted to describe this peculiar syndrome of people who have lived in multiple countries. They no longer take for granted what they used to accept. For example, French people take it for granted that service in a supermarket has to be terrible, while many US inhabitants take it for granted that cheese has to be processed.
Over time, the blog evolved to cover all these things that we come to accept little by little, like software bugs or stupid politicians.
After a move to Wordpress, the earlier edition on Blogspot was abandoned for a while. I later revived it to focus specifically on a satire of French politics, often in French only. You may find it funny. Or not (especially if you are not French and don't know anything about the jokers I keep talking about). And sometimes I'm a bit more serious on things I feel are important.
I was one of the founders of Taodyne in 2011. I am still the most frequent author on the company blog. I write tutorials and use cases for Tao3D, as well as general news about the company and, occasionally, the industry. Unfortunately, this blog was lost when Gandi pulled the plug on their US data center (I was too much under water at the time to take the couple of days it would have taken to migrate the data).
There is also an older blog on Blogspot which is no longer maintained.
I joined Red Hat in January 2017. It's good to finally be able to do things I love in the open and get paid for it. My focus is primarily on virtualization and 3D, so two of my primary centers of interest converge. I'm part of the Spice team, since remote visualization is essential to the virtualization offering of Red Hat. Although at the moment I still know very little about Spice, to be honest.
I started maintaining a daily blog of things I want to remember. Since that blog is really for myself, I don't care if things are hard to understand or follow my own internal logic. Don't expect literature there. It's really more like a kind of public lab book.
In case you wonder, "Red Skin Cat" translates into French as "Chat Peau Rouge", which is pronounced the same as "Chapeau Rouge", French for "Red Hat".