Amid yesterday's slew of education-focused announcements, Apple quietly slipped out an update to the free GarageBand for iOS app. Among other new features, it now allows owners of the iPhone X to use the TrueDepth camera to control effect and synth parameters with facial movements, completely hands-free.
It's the same technology that powers Animoji face tracking, and while this is a relatively niche development, it hints at what might come in the future for live performance and how we interact with iOS devices. If the tech eventually reaches a wider range of iPhones and the iPad, it will likely be more refined by then, too.
Here's the list of new features:
- New downloadable “Toy Box” sound pack with free educational sound effects, including animals, vehicles, and counting to ten in different languages
- Use facial expressions and the TrueDepth camera for hands-free control over musical effects like guitar wah and synth parameters*
- Adds Modern Wah guitar stompbox effect
- Provides stability improvements and bug fixes
* Requires iPhone X. GarageBand uses ARKit face tracking features to translate your facial expressions into instrument effect controls. Your face information is processed on device, and only music is captured during your performance.
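For developers curious how this works under the hood, the footnote points at ARKit's face-tracking API, which reports per-frame "blend shape" coefficients (values from 0.0 to 1.0) for dozens of facial features. The sketch below is our own illustration, not Apple's implementation: it maps the jaw-open coefficient onto the centre frequency of a band-pass filter to approximate a wah effect. The class name, frequency range, and choice of blend shape are all assumptions for the example.

```swift
import ARKit
import AVFoundation

// Hypothetical sketch (FaceWahController is our name, not Apple's):
// drive a wah-style band-pass filter from ARKit face tracking.
final class FaceWahController: NSObject, ARSessionDelegate {
    let session = ARSession()
    let engine = AVAudioEngine()
    let wah = AVAudioUnitEQ(numberOfBands: 1)

    func start() {
        // Configure a single band-pass band to act as the wah filter.
        wah.bands[0].filterType = .bandPass
        wah.bands[0].bypass = false
        engine.attach(wah)
        // (An input or player node would be connected through `wah`
        //  to engine.mainMixerNode here, then the engine started.)

        session.delegate = self
        // Face tracking requires TrueDepth hardware (iPhone X at the time).
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first,
              let jawOpen = face.blendShapes[.jawOpen]?.floatValue else { return }
        // Map the 0–1 jaw-open coefficient onto an assumed 400–2000 Hz sweep.
        wah.bands[0].frequency = 400 + (2000 - 400) * jawOpen
    }
}
```

All face data stays on device, as the release note says; only the resulting audio is part of the recording.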
What do you think of this? Are you excited to be able to "face the music"? Have you tried it out yet? Let us know!