The problem
The customer wanted to add augmented reality features to an existing mobile app.
The solution
Introduce AR experiences that build on the existing object detection work to provide educational value.
The result
Two separate augmented reality experiences were built in-house. The first depicts pollination in AR on a flower detected by a plant detection machine learning model; the second depicts photosynthesis on a detected leaf.
Tech stack
Swift
iOS
CoreML
ARKit
Adobe After Effects
Combine realistic augmented reality experiences with an object detection machine learning model.
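To ground that combination, here is a minimal Swift sketch of how a Core ML detection could be tied to a position in the AR scene: a Vision request runs the detector on the current camera frame, and the detection's bounding-box center is raycast into the world to place an anchor. `PlantDetector` is a hypothetical model class standing in for the actual plant detection model, and the coordinate conversion is deliberately simplified.

```swift
import ARKit
import Vision

final class PlantDetectionController {
    private let sceneView: ARSCNView
    // `PlantDetector` is a hypothetical Core ML model class; substitute the real detection model.
    private lazy var visionModel: VNCoreMLModel? = try? VNCoreMLModel(for: PlantDetector().model)

    init(sceneView: ARSCNView) {
        self.sceneView = sceneView
    }

    /// Runs the detector on the current camera frame and, for the best match,
    /// raycasts from the bounding-box center to find a world position for the AR content.
    func detectAndAnchor() {
        guard let frame = sceneView.session.currentFrame,
              let model = visionModel else { return }

        let request = VNCoreMLRequest(model: model) { [weak self] request, _ in
            guard let self = self,
                  let best = (request.results as? [VNRecognizedObjectObservation])?.first
            else { return }

            // Vision bounding boxes are normalized with the origin at the lower left.
            // This simplified conversion maps the box center into view coordinates;
            // a production app would also apply the camera's display transform.
            let center = CGPoint(x: best.boundingBox.midX, y: 1 - best.boundingBox.midY)
            let screenPoint = CGPoint(x: center.x * self.sceneView.bounds.width,
                                      y: center.y * self.sceneView.bounds.height)

            DispatchQueue.main.async {
                guard let query = self.sceneView.raycastQuery(from: screenPoint,
                                                              allowing: .estimatedPlane,
                                                              alignment: .any),
                      let hit = self.sceneView.session.raycast(query).first else { return }
                // Anchor the simulation at the detected flower/leaf position.
                self.sceneView.session.add(anchor: ARAnchor(name: "detectedPlant",
                                                            transform: hit.worldTransform))
            }
        }
        request.imageCropAndScaleOption = .scaleFill

        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                            orientation: .right,
                                            options: [:])
        try? handler.perform([request])
    }
}
```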
For the pollination simulation we implemented a 3D model of a bee that lands on a flower, collects nectar, and starts pollination by spreading pollen as it flies away. The bee's landing and take-off trajectory follows the detected flower, with the flower's predicted center used as the landing and take-off location.
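A minimal sketch of how such a landing and take-off sequence could be driven in SceneKit, assuming the bee model is already loaded as an SCNNode and the flower node sits at the detected flower's predicted center; the offsets and durations below are illustrative, not the shipped values.

```swift
import SceneKit

/// Animates a bee node landing on, and later taking off from, the detected flower.
/// `beeNode` is assumed to come from a bundled asset; `flowerNode` is positioned
/// at the detected flower's predicted center.
func runPollinationSequence(beeNode: SCNNode, flowerNode: SCNNode) {
    let landingPoint = flowerNode.position

    // Start above and slightly to the side, then descend onto the flower center.
    let approachStart = SCNVector3(landingPoint.x + 0.15, landingPoint.y + 0.10, landingPoint.z)
    beeNode.position = approachStart

    let descend = SCNAction.move(to: landingPoint, duration: 2.0)
    descend.timingMode = .easeOut

    // Dwell on the flower while collecting nectar.
    let collectNectar = SCNAction.wait(duration: 3.0)

    // Take off along a mirrored trajectory, spreading pollen as the bee leaves.
    let takeOffPoint = SCNVector3(landingPoint.x - 0.15, landingPoint.y + 0.12, landingPoint.z)
    let takeOff = SCNAction.move(to: takeOffPoint, duration: 2.0)
    takeOff.timingMode = .easeIn

    beeNode.runAction(.sequence([descend, collectNectar, takeOff]))
}
```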
For the photosynthesis simulation we built custom particles that represent sunlight, oxygen, and carbon dioxide molecules. The simulation sequence mirrors the real process that occurs in nature.
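A rough sketch of how one such molecule layer could be built with SceneKit's particle system; the colors, sizes, and emission rates are illustrative placeholders rather than the values used in the app.

```swift
import SceneKit
import UIKit

/// Builds a simple particle system representing one molecule type in the
/// photosynthesis simulation (sunlight, oxygen, or carbon dioxide).
func makeMoleculeParticles(color: UIColor, direction: SCNVector3) -> SCNParticleSystem {
    let particles = SCNParticleSystem()
    particles.birthRate = 20            // particles emitted per second
    particles.particleLifeSpan = 4      // seconds each molecule stays visible
    particles.particleSize = 0.005      // meters, small enough to read as a molecule
    particles.particleColor = color
    particles.particleVelocity = 0.05   // slow drift toward or away from the leaf
    particles.emittingDirection = direction
    particles.spreadingAngle = 15       // degrees of random spread around the direction
    return particles
}

// Usage: attach the systems to the node anchored on the detected leaf, sequenced to
// mirror the real process: sunlight and carbon dioxide flow in, oxygen flows out.
// leafNode.addParticleSystem(makeMoleculeParticles(color: .yellow, direction: SCNVector3(0, -1, 0)))
// leafNode.addParticleSystem(makeMoleculeParticles(color: .cyan,   direction: SCNVector3(0,  1, 0)))
```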
Both simulations take depth into account, adjusting to the distance from the detected flower or leaf.
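One way this kind of depth handling could look in code, assuming the simulation lives on a single SCNNode anchored at the detection: the node's distance from the camera is measured each frame and used to drive its scale. The scaling curve below is an assumption for illustration.

```swift
import ARKit
import SceneKit

/// Scales the simulation node based on its distance from the camera so the AR content
/// keeps a plausible apparent size whether the user is close to or far from the plant.
/// Intended to be called from the renderer's per-frame update callback.
func updateDepthScaling(for simulationNode: SCNNode, in sceneView: ARSCNView) {
    guard let cameraTransform = sceneView.session.currentFrame?.camera.transform else { return }

    let cameraPosition = SCNVector3(cameraTransform.columns.3.x,
                                    cameraTransform.columns.3.y,
                                    cameraTransform.columns.3.z)
    let nodePosition = simulationNode.worldPosition

    // Euclidean distance between the camera and the anchored simulation.
    let dx = nodePosition.x - cameraPosition.x
    let dy = nodePosition.y - cameraPosition.y
    let dz = nodePosition.z - cameraPosition.z
    let distance = (dx * dx + dy * dy + dz * dz).squareRoot()

    // Grow the simulation slightly with distance so details stay legible,
    // clamped so it never dwarfs the real flower or leaf.
    let scale = min(max(distance, 0.3), 1.5)
    simulationNode.scale = SCNVector3(scale, scale, scale)
}
```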