The Cloud

I have a sticker on my laptop that says "There is no cloud, it's just someone else's computer".

This is true, and I think it's indicative of how "the cloud" has for the most part failed to live up to its potential.

To me, what makes the cloud something other than a marketing term for existing data centers is dynamism: the ability to transparently scale capacity to the appropriate size for the workload.

There are services that offer this, but most of them require manual intervention to "scale" the service, and even then the units are typically coarse-grained, like adding another machine or perhaps a processor.  Most don't account in any direct way for the resources consumed by the software they run, and they lack the instrumentation to make it clear when resizing should take place, or to what degree.
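To make the idea concrete, here is a hypothetical sketch (not from any real service) of the kind of feedback loop a truly dynamic platform could run: measure utilization of a resource the software actually consumes, then resize in small increments, both up and down.

```python
# A hypothetical scaling policy, purely illustrative: capacity is counted in
# fine-grained units rather than whole machines, and the decision to resize
# is driven by measured utilization instead of manual intervention.
def desired_capacity(current: int, utilization: float,
                     low: float = 0.3, high: float = 0.8) -> int:
    """Return the new capacity in fine-grained units."""
    step = max(1, current // 4)       # resize by ~25%, never less than one unit
    if utilization > high:            # saturated: grow
        return current + step
    if utilization < low:             # mostly idle: shrink
        return max(1, current - step)
    return current                    # inside the comfort band: do nothing

# Example: 10 units at 92% utilization grows to 12; at 10% it shrinks to 8.
assert desired_capacity(10, 0.92) == 12
assert desired_capacity(10, 0.10) == 8
```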

One explanation for this is that it is advantageous for the service provider to encourage premature scaling, as this increases revenue.  Conversely, it is not advantageous to encourage a reduction in scale, for the same reason.

Aside from the non-dynamic nature of cloud resources within a particular service, there is essentially no interoperability between vendors, which means that even if there were transparent, dynamic scaling, it would be limited to the range of products from a given company.  This is a step backward from the pre-cloud era: when The Cloud is looked at as "just the Internet", there is a baseline platform on which portable services can be built, namely TCP/IP.

For The Cloud to supersede the utility of "just the Internet" there needs to be a standard or set of protocols that allow applications to scale both vertically within a particular vendor's service and horizontally across vendors, dynamically and with no additional cognitive load on developers.  The existence of "devops" (really another word for sysadmin) is evidence that we are not there yet.

This may justify the creation of new programming tools whose structure and syntax leave the programmer oblivious to the components and divisions of the underlying architecture.  This is nothing new; tools like these have been commonplace in high-performance and scientific computing for decades, and in several ways HPC systems have the same needs that The Cloud is trying to fulfill at Internet scale.
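As a purely illustrative sketch (nothing here is from an actual cloud standard), Python's stock Executor interface already shows the shape of such a tool: the algorithm below never names threads, processes, or machines, so swapping the executor resizes or relocates the compute without touching the code.

```python
# Illustrative only: the same run() function works unchanged whether the
# executor is backed by threads, processes, or (with a suitable third-party
# executor) a remote cluster.  The architecture is a deployment detail.
from concurrent.futures import Executor, ThreadPoolExecutor, ProcessPoolExecutor

def analyze(record: str) -> int:
    # Stand-in for real per-record work.
    return len(record)

def run(records: list, executor: Executor) -> list:
    # The algorithm only asks the executor to map work over the data;
    # it is oblivious to where that work actually happens.
    return list(executor.map(analyze, records))

if __name__ == "__main__":
    records = ["alpha", "beta", "gamma"]
    with ThreadPoolExecutor() as ex:      # local threads
        print(run(records, ex))
    with ProcessPoolExecutor() as ex:     # local processes; a cluster executor
        print(run(records, ex))           # would slot in the same way
```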

If the problem is addressed at this level of abstraction, additional technologies that are traditionally considered niche or specialist areas could be made available through The Cloud and put to work seamlessly to power existing software with no changes.  Examples might include fine-grained distributed processing across clusters of low-power nodes (e.g. a dynamic mesh of idle mobile phones) or highly specialized processing units constructed for on-demand jobs (GPUs, FPGAs, etc.).

The foundation of this is less about technology and more about a change in philosophy, and perhaps economics.  Instead of designing The Cloud as individual products from separate companies, what is needed is the development of open standards that are sufficiently valuable to consumers (programmers, product managers, etc.) that supporting them becomes a prerequisite for any serious cloud resource vendor.  The design and development of these tools must not favor any existing company, and must not be undertaken under any obligation that would create a similar conflict of interest.

The design and implementation of such a system will be the subject of future posts, but as always I'm happy to discuss anytime.

Embedding Media

Just a simple test to see if I can figure out how to embed an image inside a post.

[image: image.jpeg]

Hey! Look at me!

Tablet Test

Testing to see how practical it would be to write a longer post using a tablet (in this case, an old iPad mini).

It's not bad, better than I expected, but my typing speed is severely limited (this is a function of the iPad more so than the website).

Control-wise it seems to work well but something more like a full-screen editing mode (no sidebar controls, etc.) might be an improvement.

In any event, this seems usable in a pinch, which is much more than I can say for most other platforms.

The Story of Preposter.us

Preposter.us was born out of pain, spite and anger.

The pain was the loss of a useful tool to the mindless eating machine of "startup culture". The precursor event was the purchase of Posterous by Twitter in 2012. By 2013 the service was shut down, leaving everyone who had invested their time and content in the platform out on the street.

I won't spend too much time trying to explain why Posterous was great. I'll just say that for a large number of us, it was an amazingly low-friction way to get your writing out into the world, and keep an audience engaged across many platforms.

I looked long and hard for a replacement, but nothing came close. So one December night in 2013 I decided to create one, in the simplest form imaginable. The result was Preposter.us, and the initial version was 63 lines of Python.
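For flavor, here is a minimal sketch of the email-to-blog idea; this is not the original 63 lines, and the server name, credentials, and output directory are hypothetical placeholders.

```python
# A minimal, hypothetical email-to-blog loop: poll a mailbox over IMAP and
# turn each unread message into a static HTML post.
import email
import imaplib
from email.header import decode_header
from pathlib import Path

IMAP_HOST = "mail.example.com"                  # placeholder server
USER, PASSWORD = "blog@example.com", "secret"   # placeholder credentials
OUT_DIR = Path("posts")                         # placeholder output directory

def publish_unread() -> None:
    OUT_DIR.mkdir(exist_ok=True)
    with imaplib.IMAP4_SSL(IMAP_HOST) as imap:
        imap.login(USER, PASSWORD)
        imap.select("INBOX")
        _, data = imap.search(None, "UNSEEN")
        for num in data[0].split():
            _, msg_data = imap.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(msg_data[0][1])
            # Subject headers may arrive encoded; decode defensively.
            raw, enc = decode_header(msg.get("Subject", "post"))[0]
            subject = raw.decode(enc or "utf-8") if isinstance(raw, bytes) else raw
            body = ""
            for part in msg.walk():
                if part.get_content_type() == "text/plain":
                    body = part.get_payload(decode=True).decode(errors="replace")
                    break
            slug = "".join(c if c.isalnum() else "-" for c in subject.lower())
            (OUT_DIR / f"{slug}.html").write_text(
                f"<h1>{subject}</h1>\n<pre>{body}</pre>\n", encoding="utf-8"
            )

if __name__ == "__main__":
    publish_unread()
```

The real thing has to cope with multipart messages, attachments, and the encoding quirks mentioned later in this post, which is where most of the complexity hides.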

My only ambition for Preposter.us was my own personal use, and I continued to refine it with that in mind. At one point it became "good enough" for me and I switched from developer to user, slowly fixing bugs and adding features only after I had a genuine need as a user to do so.

During this time I assumed that I was the only person who thought publishing a blog via email made sense, and I didn't make an effort to turn anyone else on to the project. However, I had a number of conversations that made me think there might be a wider audience, so I spent a little time cleaning up the code and setting up a somewhat public instance of the server.

For whatever reason this didn't really take off, but I did get a little traction in the form of interest from the small/embedded computer crowd, as the hardware requirements for a Preposter.us server are minuscule.

I continued to refine Preposter.us and set up a public server hosting blogs under a series of domains, eventually settling on Preposter.us. A talented designer even lent a hand designing a logo/icon, and I was content to simply operate the site with a "wait and see" attitude.

I didn't have any great aspirations for Preposter.us. I was of the 1000 True Fans mindset, and imagined a scenario where a simple subscription model could cover hosting and further development costs at a modest scale. Perhaps this could have happened if I had decided to make it happen, but I didn't want to force or trick anyone into using Preposter.us; I wanted people who got it to use it, and I wanted to keep working on it to make those people happy.

Another year went by and I considered abandoning the project, but instead I made one more claim that I was going to develop it as a product and find the audience. This enthusiasm didn't last long and little came of it. Furthermore, changes to the mail clients I use introduced new, hard-to-fix bugs that prevented even me from using Preposter.us on a regular basis. This, coupled with dwindling interest and excitement from anyone other than myself, left me without the gumption to continue on.

Since then I've relocated the site, and made a couple of half-hearted attempts to re-use the code for other efforts, but in the end I realized that developing Preposter.us was killing the exact reason I wanted to have it: I wanted the lowest-friction method possible for publishing my writing and reaching my audience.

Perhaps there is an audience for Preposter.us, but at this point I'm not excited by the idea of hunting them down. I don't plan to retire the server yet (it's essentially zero-cost and zero-maintenance in its current form), but unless something changes I don't plan to spend much more time on the codebase.

I have a few ideas for audiences that would be well-served by Preposter.us. In particular, there are a lot of people around the world who don't have reliable Internet access or modern computers/smartphones, but they do have basic email and simple web browsers on inexpensive "feature phones". Preposter.us could be a great way to publish and consume content on these devices, and coupled with something like an Outernet receiver, could enable remote/disconnected communities to have locally-published content distributed to their community.

If I were to pursue this, I'd probably re-write the entire codebase to avoid some of the nasty encoding/decoding problems that have plagued Preposter.us since the beginning. For now, I plan to leave everything on autopilot until someone comes along with enough excitement about the project to spur me back into action.

Project Oberon

I'm somewhat obsessed with Project Oberon. This description of the project explains why:

"Project Oberon is a design for a complete desktop computer system from scratch. Its simplicity and clarity enables a single person to know and implement the entire system, while still providing enough power to make it useful and usable in a production environment. "

I grew up using computers that I could understand, all the way down to the metal. These computers came with block diagrams of the hardware (if not literal schematics). "Modern" computers are considered too complex to provide users with this kind of information, but Project Oberon proclaims that they don't have to be that way.

In its current form Project Oberon goes one step further than the 8-bit micros I cut my teeth on, providing even lower-level access to the computer's design. This is only possible due to the advent and availability of FPGA hardware at consumer-level prices. You could piece together a set of Project Oberon hardware today for less than the cost of a Commodore VIC-20 back in the '80s (and that's in unadjusted dollars).

I still haven't pulled the trigger on the hardware myself, but I get closer every day. I've read all the documentation at least once, but I want to get a little more proficient with the emulator before plunking down cash for the hardware. I'm also still aiming to build out a laptop version of the computer, and there are a few challenges to overcome due to the VGA video and PS/2 I/O of the board.