
Re: [SIGMusic] Android on Tacchi

Ryan, sorry I forgot to mention.  We've been looking into Android-x86.  That's the port we're using to write the TUIO module.  We discussed it for a while on this thread: http://www.acm.uiuc.edu/archives/sigmusic-l/msg00471.html.

As for the project discussion,
Both of those examples are simple and interesting.  The reason for Android is not the web browser; the point is that it's a fully featured operating system designed to be used without a keyboard.  It's already being adopted for surfaces larger than phones, so the table is a natural extension.  It has a fair number of incredibly useful libraries built in.  That said, it was designed with a phone in mind, so changes to it aren't terribly easy, but it is FOSS.

If you have any interest in those applications, we can nail down an SDK for the table as soon as possible, and we can all start writing stuff.  Experimental collaborative apps like the ones you posted are really cool and important, but we can't neglect the appeal of a multitouch DJ app either.

On Fri, Sep 10, 2010 at 9:14 PM, David Hollander <dhllndr@xxxxxxxxx> wrote:
Android seems ideal if you want other tablet functionality such as Internet browsing; however, I assume you would not be able to run Ableton or Traktor, or develop the interface as a plugin for either.

However, that is not necessarily a bad thing at all, because emulating a DJ mixer would not be a very collaborative experience. If someone examining the display plays with the interface while someone else is mixing, the result is more likely to end up sounding worse as they turn off a track, the mix drops out, etc. A DJ interface would not necessarily take advantage of the collaborative potential of such a large flat screen... perhaps a massive Tenori-on would be more collaborative?

Here are some fun examples of this idea you can play around with:

All are MIT-licensed, I believe.

Since the Tacchi is fairly widescreen in ratio, maybe it could display 2-3 square music grids or instruments at a time. Then at the bottom or top there would be a small ribbon where you could move your hand to the left or right to pan to new instruments (colored grid squares). That way you could collaborate on different instrument tracks and edit at least two instruments simultaneously.
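To make the ribbon idea concrete, here's a rough sketch of the panning math in Python. All the names and the ribbon geometry here are made up for illustration; the real screen dimensions and instrument count would come from the table setup:

```python
def pane_for_ribbon_touch(x_px, screen_width_px, num_instruments, visible_panes=3):
    """Map a horizontal touch on the pan ribbon to the leftmost
    visible instrument pane.

    The ribbon spans the full screen width; dragging right pans
    toward higher-numbered instruments, clamped so the last visible
    pane never scrolls past the final instrument.
    """
    max_left = max(0, num_instruments - visible_panes)
    # Normalize the touch position to [0, 1], clamping off-screen input.
    frac = min(max(x_px / screen_width_px, 0.0), 1.0)
    return round(frac * max_left)

# Touching the far right of the ribbon with 8 instruments and 3 visible
# panes scrolls to instruments 6-8 (leftmost pane index 5).
leftmost = pane_for_ribbon_touch(1920, 1920, 8)
```

The clamp matters on a table: fingers routinely slide off the edge of the ribbon, and without it you'd pan past the last instrument.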

Another option could be to start with the Plasma-pong-esque graphics you built as a base. When someone touches the screen, it creates fluid ripples corresponding to a sine-oscillator waveform that produces new sounds. When someone else touches the screen nearby and the ripples collide, the sine oscillators also combine and you can create different waveforms, for example a saw wave, depending on how many people are touching the screen and at what distances apart.
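The sine-to-saw idea is basically additive synthesis: a saw wave is the sum of sine harmonics at frequencies n*f with amplitudes 1/n, so each extra toucher could contribute one more harmonic. A minimal sketch (function name and the one-toucher-per-harmonic mapping are my own invention):

```python
import math

def saw_partial_sum(t, fundamental_hz, num_touchers):
    """Approximate a sawtooth by summing sine harmonics.

    Each additional toucher contributes one more harmonic, so the
    tone morphs from a pure sine (one toucher) toward a saw-like
    buzz as more people touch the surface.
    """
    # Fourier series of a sawtooth: sum over n of sin(2*pi*n*f*t)/n
    return sum(
        math.sin(2 * math.pi * n * fundamental_hz * t) / n
        for n in range(1, num_touchers + 1)
    )

# One toucher -> pure 440 Hz sine; eight touchers -> first eight
# harmonics of a saw.
sample = saw_partial_sum(t=0.001, fundamental_hz=440, num_touchers=8)
```

In a real app you'd evaluate this per audio sample and probably fade harmonics in and out as fingers land and lift, rather than switching them on instantly.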

Just some ideas.


On Thu, Sep 9, 2010 at 11:44 PM, Ryan Teel <teelrc@xxxxxxxxx> wrote:
I did some looking around and found an open-source Android-for-x86 project.
I took a look at their multitouch compatibility, and it seems like they already have a layer of abstraction set up.  It looks like we would just need to write a plugin for the way we do touch inputs. It's definitely worth looking into.
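For the sake of discussion, the plugin would essentially translate TUIO-style cursor updates (a session id plus coordinates normalized to [0, 1]) into the pointer-down/move/up events Android's input layer expects. Here's a rough Python sketch of that translation; every class and method name below is hypothetical, and the real Android-x86 hook will look different:

```python
from dataclasses import dataclass

@dataclass
class PointerEvent:
    """A minimal Android-style pointer event (hypothetical shape)."""
    pointer_id: int
    action: str      # "down", "move", or "up"
    x_px: float
    y_px: float

class TuioToPointerAdapter:
    """Translates TUIO 2D-cursor updates into PointerEvents.

    TUIO reports each cursor as (session_id, x, y) with x and y
    normalized to [0, 1]; we scale to screen pixels and diff the
    current cursor set against the previous one to decide which
    session ids are new (down), still present (move), or gone (up).
    """
    def __init__(self, width_px, height_px):
        self.width_px = width_px
        self.height_px = height_px
        self.live = {}  # session_id -> (x_norm, y_norm)

    def update(self, cursors):
        """cursors: dict mapping session_id -> (x_norm, y_norm)."""
        events = []
        for sid, (x, y) in cursors.items():
            action = "move" if sid in self.live else "down"
            events.append(PointerEvent(sid, action,
                                       x * self.width_px, y * self.height_px))
        # Any session id that disappeared since last frame lifted off.
        for sid in set(self.live) - set(cursors):
            x, y = self.live[sid]
            events.append(PointerEvent(sid, "up",
                                       x * self.width_px, y * self.height_px))
        self.live = dict(cursors)
        return events
```

The diffing step is the part that matters: TUIO just streams the full set of live cursors each frame, while Android wants discrete down/move/up transitions, so the adapter has to remember the previous frame.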


SIGMusic-l mailing list