picking apps in the iPhone-tree

I am getting into iOS programming. Again I find myself interested in the ‘pattern language’ of the field and in computing through all possible combinations. I made the following graph of relevant nodes for the iPhone (no advertising intended, it’s just a good example) to illustrate an approach. The graph is by no means complete or properly researched, nor is the categorization scheme one that I would claim to be anything but sketched out. It’s more detailed in some branches than in others.

Now let’s look at the subsets of nodes that particular apps are using…

The Sleep Talk Recorder app makes the iPhone stay in [active standby] until the [microphone] detects [sound] above a certain [threshold] – then it [records] until the signal falls below the [threshold] again.

-> function: collect audio-active snippets across time-spans

-> concrete application: collect recordings of someone’s sleep-talk

-> effect: creates a strangely intimate moment when listening to one’s own subconscious utterances.
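The trigger mechanism above can be sketched as a small loop: stay idle until the level exceeds the threshold, then record until it drops below it again. This is my own minimal illustration, not the app’s actual implementation – the list of per-chunk levels stands in for real microphone input.

```python
def collect_snippets(levels, threshold):
    """Group consecutive above-threshold chunk levels into (start, end) snippets."""
    snippets, current = [], None
    for i, level in enumerate(levels):
        if level >= threshold:
            if current is None:
                current = [i, i]   # snippet starts at this chunk
            current[1] = i         # extend the running snippet
        elif current is not None:
            snippets.append(tuple(current))  # level fell below threshold: snippet ends
            current = None
    if current is not None:
        snippets.append(tuple(current))
    return snippets

# A quiet night with two sleep-talk bursts (chunk levels in [0, 1]):
levels = [0.1, 0.2, 0.9, 0.8, 0.1, 0.1, 0.7, 0.1]
print(collect_snippets(levels, threshold=0.5))  # [(2, 3), (6, 6)]
```

Everything outside the snippets is simply never stored, which is what compresses a whole night into a few seconds of audio.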

The Bug Spray – Ultrasonic app uses the [loudspeaker] to emit [high-frequency tones] that have the effect of repelling insects (the tones lie above the hearing range of most people).

-> Utilizes the highest range of frequencies that the built-in loudspeaker can produce, for a practical purpose.
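Generating such a tone is just sampling a sine wave. A sketch, assuming a standard 44.1 kHz sample rate, which caps the playable frequency at the Nyquist limit of 22.05 kHz:

```python
import math

def sine_samples(freq_hz, duration_s, sample_rate=44100):
    """Float samples of a sine tone; frequencies above sample_rate/2 would alias."""
    n = int(duration_s * sample_rate)
    return [math.sin(2 * math.pi * freq_hz * t / sample_rate) for t in range(n)]

# ~18 kHz: above most adults' hearing range, still below the 22.05 kHz Nyquist limit
samples = sine_samples(18000, 0.01)
print(len(samples))  # 441
```

Whether the tiny built-in speaker reproduces such frequencies loudly enough to bother an insect is a hardware question the app can only hope for.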

The Sonified app uses [brightness] and [colors] from the [visual input] of the [camera] and [processes] that into a (particular kind of) [audio output] – the [visual output] is the same as the [visual input].

-> Mimics the experience of people with synesthesia, whose brains translate visual stimuli into audio stimuli.
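One simple way such a visual-to-audio mapping could work – purely my own toy version, not how Sonified actually does it – is to map a frame’s mean brightness linearly onto a pitch range:

```python
def brightness_to_pitch(brightness, low_hz=220.0, high_hz=880.0):
    """Linear map from brightness in [0, 1] to a pitch range (assumed mapping)."""
    return low_hz + brightness * (high_hz - low_hz)

def frame_brightness(pixels):
    """Mean of grayscale pixel values in [0, 1], standing in for a camera frame."""
    return sum(pixels) / len(pixels)

dark_frame = [0.1, 0.2, 0.1, 0.0]
print(brightness_to_pitch(frame_brightness(dark_frame)))  # a low tone: 286.0
print(brightness_to_pitch(1.0))                           # brightest frame: 880.0
```

Color channels could be mapped onto further audio parameters (timbre, stereo position) in the same spirit.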

The TonePrint app lets the user [access a database] of TonePrints (each a package of guitar-effect parameters that can, for instance, mimic the favorite settings of a famous rock star). The selected one gets sent to the pedal (the piece of hardware that takes the incoming guitar signal and passes it on to the loudspeaker/amplifier after processing) by holding the iPhone to the guitar’s pickup and playing a unique-sounding magnetic impulse from the [loudspeaker]. The pedal receives this encoded impulse and decodes it into guitar-effect parameters.

-> Makes use of the magnetic impulses that every loudspeaker creates as a “side effect” of producing sound.
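The core trick is turning parameter data into an audio signal and back. As a hypothetical sketch of the idea (TC Electronic’s real encoding is not public, so this is a made-up frequency-shift scheme), each data byte could become a short tone whose frequency encodes its value:

```python
BASE_HZ, STEP_HZ = 1000, 10  # assumed carrier base and per-value spacing

def encode(data):
    """Map each byte to a tone frequency (FSK-style toy encoding)."""
    return [BASE_HZ + b * STEP_HZ for b in data]

def decode(freqs):
    """Invert the mapping on the pedal side."""
    return bytes((f - BASE_HZ) // STEP_HZ for f in freqs)

params = bytes([42, 7, 255])          # pretend guitar-effect parameters
assert decode(encode(params)) == params
print(encode(params))                 # [1420, 1070, 3550]
```

The round trip works because the mapping is bijective; the hard engineering lies in detecting those frequencies reliably from a noisy pickup signal.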

Now that I have displayed some apps as subsets of nodes in “the tree of all-things-iPhone” – imagine the number of ALL possible subsets that could be formed from such a tree (not mine above, but a comprehensive and correctly structured one)… it’s 2^n, with n being the number of nodes = a lot! Now find a way to automatically discard the impossible/nonsense ones and look at the remaining branches… what you have then are ALL possible app-skeletons that any creative mind could possibly come up with, given the current (!) state of hard- and software, no? I think so.
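The enumerate-then-prune idea can be shown on a tiny node set. The pruning rule below is a deliberately crude placeholder (an “app skeleton” needs at least one input and one output node); a real filter would encode far richer constraints:

```python
from itertools import combinations

nodes = ["microphone", "camera", "loudspeaker", "threshold"]

def is_plausible(subset):
    """Toy pruning rule (assumed): keep subsets with >= 1 input and >= 1 output node."""
    inputs, outputs = {"microphone", "camera"}, {"loudspeaker"}
    return bool(set(subset) & inputs) and bool(set(subset) & outputs)

# All 2^n subsets of the node set:
all_subsets = [c for r in range(len(nodes) + 1)
               for c in combinations(nodes, r)]
print(len(all_subsets))                                  # 2^4 = 16
print(len([s for s in all_subsets if is_plausible(s)]))  # 6 surviving skeletons
```

Even this toy filter cuts the space by more than half; the combinatorial explosion of 2^n is exactly why the filter, not the enumeration, is the interesting part.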

I only used examples here that make some use of the “physical features” of the iPhone. The same “mapping out and describing apps as subsets” could be done for less physical-feature-heavy and more data-heavy apps, with a categorization scheme that embraces all nuances of software/database/processing things…

This is in the thinking-line of A, B and C here.

Well, the next obvious thing to do for me is to actually show some of those possibly relevant branches (=app-skeletons)…

It’s important to notice that the search for new apps is an efficiency problem. For a given state of available hard- and software modules, it’s only a matter of time until all possible branches have been explored. Only making the pie larger can then open space for new combinations of modules. This seems to be a crucial distinction within ‘creativity’ in general: efficiency-creativity on one hand, which computes through all possible branches of a given tree, and pie-extending creativity on the other, which adds new nodes to the tree. But then again, there is also the creativity of using an app beyond what it is intended for! For instance, I could lay my iPhone next to a pond and run the ‘Sleep Talk Recorder’ for a day. What I would get is a dense collage of animal sounds that would otherwise be spread over a whole day.


Published by

Benjamin Aaron Degenhart

Currently pursuing a Masters in Computational Science and Engineering at TU Munich.
