Tag Archives: multitouch

Visual programming on (and for) multitouch handhelds

I wonder if anyone who’s been following the iPhone, iPod touch, Neo1973 or any other multitouch handheld has considered the possibility of visual programming for these devices?

Since these devices, by their nature, aren’t built to be adept typing machines (that goes for any mobile, since you can only type with one or two fingers at a time), they aren’t the best devices on which to create an application with a typed programming language (even the JavaScript used for third-party webapps on the iPhone/iPod touch). They are, however, *intensely* graphical: the screens of these devices, which display the information received or generated by the OS, tend to take up the majority of the device’s front face.

So if the combination of a graphical interface with an all-fingers interaction method is our only way to make full use of these devices, then what about the applications that could be installed on them (the jailbreaking of Apple’s devices is another story)? At the moment, most applications for mobile devices, multitouch or not, are created through the use of desktop computer keyboards, in a variety of programming or scripting languages (sometimes with interfaces built from markup, stylesheets and vector graphics), all of which have to be typed copiously and fluidly.

But what if one doesn’t have a desktop or laptop device available, but has a multitouch handheld to, well, handle? What if the user feels like creating an application that isn’t already available on the device, like a plugin for a built-in audio player interface?

I think that, in this case, an on-device visual programming environment, one that is made for multitouch interaction, may be the best solution for creating applications on the device.

Since visual programming, as far as I can tell, is less keyboard-driven and more mouse-driven on a desktop computer (drag n’ drop and all that), such a programming environment could probably be driven by fingers just as easily. This would make it far easier for the users of these devices to create installable programs and applications from within the devices rather than from without.
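To make that concrete, here’s a rough sketch (in Python, purely illustrative; the class, node, and method names are all invented) of the kind of data model such a finger-driven environment might sit on top of: each draggable block is a node, a finger drag from one block to another wires up a connection, and evaluating the graph pulls values through those wires.

```python
# Hypothetical core of a finger-driven visual programming environment.
# Each on-screen block is a Node; a drag gesture between two blocks
# would call connect(); running the program evaluates the graph.

class Node:
    def __init__(self, name, func):
        self.name = name
        self.func = func      # the operation this block performs
        self.inputs = []      # upstream blocks, wired by finger drags

    def connect(self, upstream):
        # In the UI, this is a drag from one block's output to
        # another block's input.
        self.inputs.append(upstream)

    def evaluate(self):
        # Pull values from upstream blocks, then apply this block's op.
        args = [node.evaluate() for node in self.inputs]
        return self.func(*args)

# Build a tiny graph by "dragging wires": (3 + 4) * 2
const3 = Node("three", lambda: 3)
const4 = Node("four", lambda: 4)
add = Node("add", lambda a, b: a + b)
double = Node("double", lambda x: x * 2)

add.connect(const3)
add.connect(const4)
double.connect(add)

print(double.evaluate())  # → 14
```

The point is that none of this requires typing once the blocks exist; every operation above maps to a tap or a drag.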

Now if only Firefox were ported to the iPhone so we could try this theory out.

Demo of visual programming, using Quartz Composer:

Idea: wireless “dumb terminal” touchscreen display for smartphones

OK, just thought about this last night:

I notice that most smartphones, including the multitouch fullscreen ones like the iPhone and Neo1973, are built as all-in-ones, with the storage and processing unit placed directly behind the display.

Then, last night, I saw the video of the most recent Macworld keynote by Steve Jobs, in which he introduced the MacBook Air. I had already read on Digg about how this new notebook computer, in order to achieve one of the thinnest laptop form factors yet, sacrificed such long-standing laptop components as the optical disc (CD/DVD) drive. In its place is an external optical disc drive that streams optical media data (even DVD-borne software titles such as Office ’08 for Mac or OS X 10.5) to the MacBook Air, where the data can then be copied or installed to the hard drive. This essentially renders the MacBook Air, in Jobs’ words, a “wireless machine”; that suits Apple’s modus operandi, which leans more toward digitally transmitted data (iTunes) than toward data on physical CDs, DVDs, or Blu-ray/HD-DVD discs.

==============

Now, the following is becoming increasingly true:

The more mobile and free-wheeling the device, the smaller the form factor, the greater the network dependence, and the greater the need for interaction with the display.

The MacBook Air, the iPhone, and the iPod touch are all Apple’s mobile “wireless machines”. The software on these devices relies upon a wireless, streaming network for several applications to function correctly, and the interfaces lean ever more heavily on the finger for various tasks, whether it’s zooming into a photo or dialing a phone number.

However, in the mobile industry (including Apple’s mobile products), most of the components (CPU, storage, etc.) sit directly beneath the display (or, if the device isn’t multitouch, beneath both the screen and the buttons). The same goes for other multitouch computers, such as the Microsoft Surface, which is due for sale by 2009.

But what if the multitouch display could be used for the same communication purposes as your average smartphone without it being welded or tied to the computer that holds the hard drive and electronics?

==============

My idea is what I call “the commoditization of handhelds and other mobile computer devices”.

What this will allow is for someone to use one or more displays with a single, more stationary phone unit, and to upgrade that stationary unit cheaply, with greater hardware enhancements than would be feasible if the phone were tied directly to the display.

The phone device will “talk” wirelessly with the portable display, which has no storage and very few electronic components beyond those needed for the display itself to function. The display recognizes any touch motions, which are then sent back to the stationary smartphone device where the operating system is stored. The smartphone is also connected wirelessly to the Internet through a Wi-Fi router.
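As a rough illustration (not any real protocol; the message format, address, and function names here are all invented), the display-to-phone link might boil down to something like this: the “dumb” display serializes each touch as a small datagram and ships it over the wireless link, and the phone-side OS decodes it and dispatches it to whatever application is active.

```python
import json
import socket

# Assumed address of the stationary phone unit on the local network.
PHONE_ADDR = ("192.168.1.50", 9000)

def encode_touch(x, y, gesture):
    """Display side: pack one touch event into a small datagram."""
    return json.dumps({"x": x, "y": y, "gesture": gesture}).encode("utf-8")

def send_touch(sock, x, y, gesture):
    """Display side: ship the event over the wireless link."""
    sock.sendto(encode_touch(x, y, gesture), PHONE_ADDR)

def decode_touch(datagram):
    """Phone side: unpack the event so the OS can route it."""
    return json.loads(datagram.decode("utf-8"))

# Round trip: what the display sends is what the phone's OS sees.
event = decode_touch(encode_touch(120, 360, "tap"))
print(event["gesture"])  # → tap
```

The display never needs to understand what a tap *means*; all the interpretation happens on the phone unit, which is exactly what keeps the display cheap.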

This will be a more cost-effective approach to building smartphones, as it lets the computer and the portable multitouch display each stay true to the concerns of its own domain.

This also allows for greater innovation in the display field, which will eventually make its way to the mobile market. This is one example: