The people at itemis provide a vast range of skills beyond model-driven engineering. For me, this versatility led to an iPhone project in cooperation with Weischer Mobile and phi mobile media, in which a fun marketing app had to be delivered. The design concept presented to Burlington sketched a line (water), a circle (duck) inside a rectangle (iPhone), and an arrow (movement). A few bullet points completed the requirements for the part I was asked to implement. Even though the final app offers some more features, such as a movie, funny photo tricks and wallpapers, I was responsible for the ducky only. This is what I came up with:
Under the hood, the animated ducky as well as the illusion of water is based on a hand-crafted physics engine. The water is implemented as a particle-based fluid simulation. If you are interested in this topic, Matthias Müller published a paper of the same title, along with great slides for SIGGRAPH 2007, which make a good starting point. The ducky itself interacts with these particles as a rigid body, with buoyancy implemented explicitly. Play with Erik Neumann’s demo to have fun with rigid body physics. These concepts were adapted and combined with the accelerometer and touch sensors built into the iPhone to let the user interact with the simulation.
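To make the two ingredients a bit more concrete, here is a minimal sketch of an SPH-style density estimate (in the spirit of Müller's paper) and of explicit buoyancy. This is an illustration in Python, not the engine's actual code, and every name and constant is an assumption:

```python
import math

# Illustrative constants -- assumed tuning values, not the app's.
H = 0.05                # SPH smoothing radius in metres
GRAVITY = 9.81          # m/s^2
WATER_DENSITY = 1000.0  # kg/m^3

def poly6(r2):
    """Mueller's poly6 smoothing kernel, evaluated on a squared distance."""
    if r2 > H * H:
        return 0.0
    return 315.0 / (64.0 * math.pi * H ** 9) * (H * H - r2) ** 3

def density_at(p, particles, particle_mass):
    """SPH density estimate: mass-weighted kernel sum over neighbours."""
    return sum(
        particle_mass * poly6((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2)
        for q in particles
    )

def net_vertical_force(mass, submerged_volume):
    """Explicit buoyancy for the rigid body: Archimedes' upthrust minus
    weight. A positive result pushes the ducky upwards."""
    upthrust = WATER_DENSITY * GRAVITY * submerged_volume
    return upthrust - mass * GRAVITY
```

With these numbers, a 100 g duck displacing 0.2 litres of water experiences a positive net force and bobs upwards, which is all the "floating" behaviour really amounts to.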
Dots and circles, which is all the physics engine delivers on its own, were obviously not the ultimate goal. From the raw simulation data the water surface area had to be derived, the ducky had to be put in shape, and smooth animations and sounds were needed to round out the illusion of an interactive bathtub.
Without going into every detail, I want to emphasize that the iPhone is not a MacBook Pro. Where you would traditionally use marching cubes/squares to convert distinct particles into a cohesive area, the limited processing power of mobile devices requires you to squeeze out every cycle by thinking outside the box and taking advantage of hardware acceleration. In this case, OpenGL ES offers a variety of techniques, including framebuffers, blending functions and alpha tests, to shift the necessary steps onto the GPU.
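The principle behind that GPU trick can be mimicked on the CPU: additively "splat" each particle as a radial falloff into an offscreen buffer (the framebuffer-plus-blending step), then threshold the result (the alpha-test step) so overlapping particles merge into one cohesive water area. The following Python sketch is purely illustrative; resolution, radius and threshold are assumed values, not the app's:

```python
GRID_W, GRID_H = 64, 32  # offscreen buffer resolution (assumed)
RADIUS = 6               # splat radius in pixels (assumed)
THRESHOLD = 0.5          # alpha-test cutoff (assumed)

def splat(buf, cx, cy):
    """Additively blend one particle's radial falloff into the buffer,
    the CPU analogue of glBlendFunc(GL_ONE, GL_ONE)."""
    for y in range(max(0, cy - RADIUS), min(GRID_H, cy + RADIUS + 1)):
        for x in range(max(0, cx - RADIUS), min(GRID_W, cx + RADIUS + 1)):
            d2 = (x - cx) ** 2 + (y - cy) ** 2
            buf[y][x] += max(0.0, 1.0 - d2 / (RADIUS * RADIUS))

def water_mask(particles):
    """Return a boolean mask marking the cohesive water area."""
    buf = [[0.0] * GRID_W for _ in range(GRID_H)]
    for cx, cy in particles:
        splat(buf, cx, cy)
    return [[v >= THRESHOLD for v in row] for row in buf]
```

Two nearby particles pass the threshold in the gap between them because their falloffs overlap, so they appear as a single blob. That merging is exactly what the blending/alpha-test combination produces on the GPU, at a fraction of the cost of marching squares.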
Other findings from the project include the unreliability of sensor data as well as users’ unpredictable behavior. Be aware that humans and machines act differently under certain circumstances. For applications where the interaction between the two is crucial, you should run usability tests early, often and extensively, and once more before delivery.
The iPhone is different from other mobile devices and from the emulator. Generally, floating point arithmetic is evaluated more efficiently than fixed-point. Some GPU operations are executed faster on the device than on the emulator, while others perform poorly. Therefore, you should profile on the device regularly to test different approaches in your design. And: you must not forget to do so with sound enabled, since sound processing can consume more than 30% of your overall processing power.
So, go ahead and grab your version of the Burlington Duck on the App Store:
If you are interested in details (e.g. the “unpredictable human factor”, the production of the screencast, etc.), please let me know in the comments of this post.