
Campaign Summary
“Run is Chill” is a social-first campaign by Riot of Colors for Adidas, designed to reframe running as something laid-back, expressive, and personal. It’s not about pushing pace or performance — it’s about motion that matches your mood, whether you’re catching a sunrise jog or taking a rainy stroll after work.
The campaign features hand-drawn animations by Cactus Under Rainbow that shift depending on your real-time environment. The visuals adapt to time of day, weather, and even location, and are layered naturally over user-generated content. Think of a chill breeze in Ultraboosts or a sleepy cloud in a track jacket.
Content is pushed across Instagram, TikTok, and Snapchat via reactive AR filters and camera overlays. A mobile-first microsite offers a more customized experience, letting users upload or record clips that get styled and remixed with campaign visuals. Adidas apparel is woven into the animations themselves, making the product a seamless part of the story.
Platform + Tech Stack Overview
Front-End & Microsite
React or Vue.js for a reactive UI
Next.js (preferred when deploying to Vercel) or Nuxt.js (for SSR with Vue); a component sketch follows this list
Hosted via Vercel (for fast global deployment) or AWS Amplify (if using AWS stack)
Tailwind CSS or styled-components for modular, fast styling
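To ground the front-end choices, here is a minimal sketch of a microsite component, assuming the Next.js + React + Tailwind route; the component name (MoodBanner), the daypart buckets, and the utility classes are illustrative placeholders rather than final campaign specs.

```tsx
// components/MoodBanner.tsx
// Minimal sketch of a microsite section that swaps its visual treatment
// based on time of day. Names and color classes are illustrative.
import { useEffect, useState } from "react";

type Daypart = "sunrise" | "day" | "sunset" | "night";

// Derive a rough daypart from the visitor's local clock.
function getDaypart(date: Date = new Date()): Daypart {
  const hour = date.getHours();
  if (hour >= 5 && hour < 9) return "sunrise";
  if (hour >= 9 && hour < 17) return "day";
  if (hour >= 17 && hour < 21) return "sunset";
  return "night";
}

// Tailwind utility classes per daypart (placeholder palette).
const daypartStyles: Record<Daypart, string> = {
  sunrise: "bg-orange-100 text-orange-900",
  day: "bg-sky-100 text-sky-900",
  sunset: "bg-purple-100 text-purple-900",
  night: "bg-slate-900 text-slate-100",
};

export default function MoodBanner() {
  const [daypart, setDaypart] = useState<Daypart>("day");

  // Resolve the daypart on the client so server-rendered markup stays stable.
  useEffect(() => {
    setDaypart(getDaypart());
  }, []);

  return (
    <section className={`rounded-2xl p-8 transition-colors ${daypartStyles[daypart]}`}>
      <h2 className="text-2xl font-semibold">Run is Chill</h2>
      <p className="mt-2">Motion that matches your mood: {daypart} edition.</p>
    </section>
  );
}
```

Resolving the daypart inside useEffect keeps SSR output deterministic and lets the browser clock take over after hydration.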
APIs & Environmental Inputs
OpenWeatherMap API for weather-based animation triggers (see the environment sketch after this list)
WorldTimeAPI or native JavaScript Date/time logic for day and night animations
Geolocation API for location-aware visuals
Optional: IPStack or Mapbox for more granular geolocation and city-based content
Spotify API to dynamically generate mood-based playlists for users
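As a sketch of how these inputs could feed the animation layer, the snippet below combines the browser Geolocation API with OpenWeatherMap's current-weather endpoint and maps the result to a mood key; the mood names and mapping rules are placeholders, not final campaign logic.

```ts
// lib/environment.ts
// Sketch of the environmental-input layer: browser geolocation + the
// OpenWeatherMap current-weather endpoint, mapped to an animation "mood" key.

type Mood = "sunny-stride" | "rainy-stroll" | "cloud-drift" | "night-glide";

interface EnvSnapshot {
  condition: string;   // e.g. "Clear", "Rain", "Clouds" from OpenWeatherMap
  isNight: boolean;
  mood: Mood;
}

// Wrap the callback-based Geolocation API in a promise.
function getPosition(): Promise<GeolocationPosition> {
  return new Promise((resolve, reject) =>
    navigator.geolocation.getCurrentPosition(resolve, reject, { timeout: 10_000 })
  );
}

export async function getEnvSnapshot(apiKey: string): Promise<EnvSnapshot> {
  const { coords } = await getPosition();

  // OpenWeatherMap "current weather" endpoint.
  const url =
    `https://api.openweathermap.org/data/2.5/weather` +
    `?lat=${coords.latitude}&lon=${coords.longitude}&units=metric&appid=${apiKey}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Weather request failed: ${res.status}`);
  const data = await res.json();

  const condition: string = data.weather?.[0]?.main ?? "Clear";
  const hour = new Date().getHours();
  const isNight = hour < 6 || hour >= 20;

  // Simple mood mapping; the real campaign rules can be richer.
  let mood: Mood = "sunny-stride";
  if (isNight) mood = "night-glide";
  else if (condition === "Rain" || condition === "Drizzle") mood = "rainy-stroll";
  else if (condition === "Clouds") mood = "cloud-drift";

  return { condition, isNight, mood };
}
```

The resulting mood key can drive both the animation overlay choice and the mood-based Spotify playlist request.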
Animation, Filters & AI Stylization
Adobe Animate and After Effects to create hand-drawn visuals
Lottie (via Bodymovin) for optimized vector animations (playback sketch after this list)
PNG or WebM sprite sheets for AR and camera overlays
RunwayML or Stable Diffusion API for AI-powered style filters (applied to UGC or webcam feed)
TensorFlow.js for optional client-side style transfer or lightweight ML applications
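For the Lottie playback step specifically, here is a minimal sketch of loading a Bodymovin-exported animation keyed to the current mood; the file paths and mood keys are placeholders.

```ts
// lib/overlays.ts
// Sketch of loading a Lottie (Bodymovin-exported) animation that matches the
// current mood. Paths and mood keys are placeholders.
import lottie, { AnimationItem } from "lottie-web";

const moodAnimations: Record<string, string> = {
  "sunny-stride": "/animations/chill-breeze.json",
  "rainy-stroll": "/animations/rain-doodle.json",
  "cloud-drift": "/animations/sleepy-cloud.json",
  "night-glide": "/animations/night-trail.json",
};

export function playMoodOverlay(container: HTMLElement, mood: string): AnimationItem {
  // Render as SVG so the hand-drawn linework stays crisp at any size.
  return lottie.loadAnimation({
    container,
    renderer: "svg",
    loop: true,
    autoplay: true,
    path: moodAnimations[mood] ?? moodAnimations["sunny-stride"],
  });
}
```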
Social Filters & AR Integration
Spark AR for Instagram and Facebook
Effect House for TikTok
Lens Studio for Snapchat
Filters use real-time weather, location, and time-of-day logic to trigger different animation overlays (shared trigger rules sketched after this list)
Assets exported as transparent animations, layered onto the user’s camera view
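Each AR platform has its own scripting environment, so the sketch below only captures the shared overlay-selection rules in plain TypeScript; the overlay names and condition buckets are illustrative and would be re-implemented inside Spark AR, Effect House, and Lens Studio.

```ts
// filters/triggerLogic.ts
// Platform-agnostic sketch of the rules each filter re-implements in its own
// scripting environment. Overlay names are placeholders.

interface FilterContext {
  condition: "Clear" | "Clouds" | "Rain" | "Snow";
  hour: number; // local hour, 0-23
}

export function pickOverlay(ctx: FilterContext): string {
  const night = ctx.hour < 6 || ctx.hour >= 20;
  if (night) return "night-trail-overlay";
  switch (ctx.condition) {
    case "Rain": return "rainy-stroll-overlay";
    case "Snow": return "snow-shuffle-overlay";
    case "Clouds": return "sleepy-cloud-overlay";
    default: return "chill-breeze-overlay";
  }
}
```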
Backend, Storage & Processing
Firebase or Supabase for:
Storing user-generated content (upload sketch after this list)
Optional login/authentication (email or OAuth)
Real-time syncing and database triggers
Cloudinary or FFmpeg (via serverless functions) to process, overlay, and optimize video/image content
Optional CDN delivery for fast content sharing across social
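A minimal sketch of the UGC upload path, assuming the Firebase option (modular v9+ SDK); the project config, bucket path, and function name are placeholders, and the returned URL would be handed to the Cloudinary/FFmpeg step for overlay compositing.

```ts
// lib/storage.ts
// Sketch of uploading a user clip to Firebase Storage and returning a
// shareable URL. Config values and the bucket path are placeholders.
import { initializeApp } from "firebase/app";
import { getStorage, ref, uploadBytes, getDownloadURL } from "firebase/storage";

const app = initializeApp({
  apiKey: "YOUR_API_KEY",                     // placeholder
  projectId: "run-is-chill",                  // placeholder
  storageBucket: "run-is-chill.appspot.com",  // placeholder
});
const storage = getStorage(app);

export async function uploadClip(file: File, userId: string): Promise<string> {
  // Namespace uploads per user; the timestamp keeps filenames unique.
  const clipRef = ref(storage, `ugc/${userId}/${Date.now()}-${file.name}`);
  await uploadBytes(clipRef, file, { contentType: file.type });
  // The returned URL feeds the processing step before social sharing.
  return getDownloadURL(clipRef);
}
```

Supabase Storage offers an equivalent upload call if the team lands on that stack instead.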
Optional Enhancements
Spotify Web Playback SDK to play campaign playlists directly within the microsite
Web Speech API for lightweight, accessible voice-based controls
WebGL or Three.js for ambient interactions such as animated backgrounds, light shifts, and motion trails (ambient sketch after this list)
Adobe Aero or 8thWall for future event-based AR installations or gallery integrations
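For the ambient WebGL idea, here is a small Three.js sketch of a slowly drifting particle background tinted by the current mood; the particle count, color, and function name are placeholders.

```ts
// lib/ambient.ts
// Sketch of a purely decorative Three.js background: a drifting particle
// field whose color can shift with the mood key.
import * as THREE from "three";

export function startAmbient(canvas: HTMLCanvasElement, color = 0x9ad0ff): () => void {
  const renderer = new THREE.WebGLRenderer({ canvas, alpha: true, antialias: true });
  renderer.setSize(canvas.clientWidth, canvas.clientHeight, false);

  const scene = new THREE.Scene();
  const camera = new THREE.PerspectiveCamera(
    60,
    canvas.clientWidth / canvas.clientHeight,
    0.1,
    100
  );
  camera.position.z = 5;

  // A loose cloud of points that drifts like a light breeze.
  const count = 400;
  const positions = new Float32Array(count * 3);
  for (let i = 0; i < count * 3; i++) positions[i] = (Math.random() - 0.5) * 10;
  const geometry = new THREE.BufferGeometry();
  geometry.setAttribute("position", new THREE.BufferAttribute(positions, 3));
  const points = new THREE.Points(
    geometry,
    new THREE.PointsMaterial({ color, size: 0.05 })
  );
  scene.add(points);

  let frame = 0;
  const tick = () => {
    points.rotation.y += 0.0008; // slow, "chill" drift
    renderer.render(scene, camera);
    frame = requestAnimationFrame(tick);
  };
  tick();

  // Return a cleanup function so the microsite can stop the loop on unmount.
  return () => cancelAnimationFrame(frame);
}
```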
