Campaign Summary
“Run is Chill” is a social-first campaign by Riot of Colors for Adidas, designed to reframe running as something laid-back, expressive, and personal. It’s not about pushing pace or performance — it’s about motion that matches your mood, whether you’re catching a sunrise jog or taking a rainy stroll after work.
The campaign and merch feature hand-drawn animations by Cactus Under Rainbow that shift depending on your real-time environment. The visuals adapt to time of day, weather, and even location, and are layered naturally over user-generated content. Think of a chill breeze in Ultraboosts or a sleepy cloud in a track jacket.
A mobile-first microsite creates a dynamic, tailored experience. Users can upload or record clips that are automatically styled and remixed with campaign visuals, with Adidas apparel integrated seamlessly into the animation.
In addition, the microsite generates an AI-powered Spotify playlist specific to your current location. Whether you’re in D.C., L.A., or Chicago, the system curates a clean, energizing soundtrack based on local sounds, classic and current: go-go in D.C., house in Chicago, or West Coast hip-hop in L.A. It’s a vibe engine that helps you run with the rhythm of the city you’re in.
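For reference, here is a minimal sketch of how that city-to-genre mapping could drive the Spotify Web API. The search, create-playlist, and add-tracks endpoints are real; the city list, genre seeds, and function names are placeholder assumptions, and OAuth token handling is left out.

```ts
// Sketch only: map a detected city to local genre seeds, search Spotify for matching
// tracks, and build a private playlist. Assumes an OAuth access token with the
// playlist-modify-private scope; the city list and function names are hypothetical.
const cityGenres: Record<string, string[]> = {
  "Washington, D.C.": ["go-go", "funk"],
  "Chicago": ["house", "chicago house"],
  "Los Angeles": ["west coast hip hop", "g-funk"],
};

async function spotify(path: string, token: string, init: RequestInit = {}): Promise<any> {
  const res = await fetch(`https://api.spotify.com/v1${path}`, {
    ...init,
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
  });
  if (!res.ok) throw new Error(`Spotify ${path} failed: ${res.status}`);
  return res.json();
}

export async function buildCityPlaylist(city: string, userId: string, token: string): Promise<string> {
  const genres = cityGenres[city] ?? ["chill"];

  // Pull a handful of tracks per local genre.
  const trackUris: string[] = [];
  for (const genre of genres) {
    const data = await spotify(
      `/search?q=${encodeURIComponent(`genre:"${genre}"`)}&type=track&limit=10`,
      token
    );
    trackUris.push(...data.tracks.items.map((t: { uri: string }) => t.uri));
  }

  // Create the playlist, then add the collected tracks.
  const playlist = await spotify(`/users/${userId}/playlists`, token, {
    method: "POST",
    body: JSON.stringify({ name: `Run is Chill - ${city}`, public: false }),
  });
  await spotify(`/playlists/${playlist.id}/tracks`, token, {
    method: "POST",
    body: JSON.stringify({ uris: trackUris.slice(0, 30) }),
  });

  return playlist.external_urls.spotify;
}
```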
Content lives and breathes across Instagram, TikTok, and Snapchat via reactive AR filters and camera overlays, using weather, time, and geo-data to generate unique visual layers and audio cues — turning every run into a shareable, living visual mixtape.
Platform + Tech Stack Overview
Front-End & Microsite
- React or Vue.js for a reactive UI
- Next.js (preferred for Vercel) or Nuxt.js (for SSR with Vue)
- Hosted via Vercel (for fast global deployment) or AWS Amplify (if using AWS stack)
- Tailwind CSS or styled-components for modular, fast styling
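As a starting point, a minimal sketch of the microsite shell, assuming Next.js (App Router) with Tailwind; the MoodCanvas component and the copy are placeholders, not final campaign assets.

```tsx
// app/page.tsx: sketch of the microsite landing view (Next.js App Router + Tailwind).
// MoodCanvas is a hypothetical client component that renders the environment-aware animation layer.
import MoodCanvas from "./components/MoodCanvas";

export default function Home() {
  return (
    <main className="min-h-screen bg-neutral-950 text-neutral-50 flex flex-col items-center justify-center gap-6">
      <h1 className="text-4xl font-semibold tracking-tight">Run is Chill</h1>
      <p className="max-w-md text-center text-neutral-400">
        Motion that matches your mood. Upload a clip and we will style it to your city, weather, and time of day.
      </p>
      {/* Client-side layer that reacts to weather, time, and location */}
      <MoodCanvas />
    </main>
  );
}
```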
APIs & Environmental Inputs
- OpenWeatherMap API for weather-based animation triggers
- WorldTimeAPI or native JavaScript Date/time logic for day and night animations
- Geolocation API for location-aware visuals
- Optional: IPStack or Mapbox for more granular geo-location and city-based content
- Spotify API to dynamically generate mood & location-based playlists for users
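A hedged sketch of how these inputs could collapse into a single animation trigger: the browser Geolocation API and OpenWeatherMap current-weather endpoint are real, while the trigger names and the day/night cutoff are placeholder assumptions.

```ts
// Sketch: derive an animation "trigger" from the browser's geolocation, local time,
// and current weather. Trigger names like "rain-night" are placeholders.
type Trigger = `${"clear" | "clouds" | "rain" | "snow"}-${"day" | "night"}`;

function getPosition(): Promise<GeolocationPosition> {
  return new Promise((resolve, reject) =>
    navigator.geolocation.getCurrentPosition(resolve, reject, { timeout: 10_000 })
  );
}

export async function resolveTrigger(apiKey: string): Promise<Trigger> {
  const { coords } = await getPosition();

  // OpenWeatherMap "current weather" endpoint (metric units).
  const res = await fetch(
    `https://api.openweathermap.org/data/2.5/weather?lat=${coords.latitude}&lon=${coords.longitude}&units=metric&appid=${apiKey}`
  );
  const weather = await res.json();

  // Collapse OpenWeatherMap's condition groups down to the four states the animations cover.
  const main = String(weather.weather?.[0]?.main ?? "Clear").toLowerCase();
  const condition =
    main.includes("rain") || main.includes("drizzle") ? "rain"
    : main.includes("snow") ? "snow"
    : main.includes("cloud") ? "clouds"
    : "clear";

  // Native Date logic stands in for WorldTimeAPI here: 6am to 7pm counts as "day".
  const hour = new Date().getHours();
  const phase = hour >= 6 && hour < 19 ? "day" : "night";

  return `${condition}-${phase}` as Trigger;
}
```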
Animation, Filters & AI Stylization
- Adobe Animate and After Effects to create hand-drawn visuals
- Lottie (via Bodymovin) for optimized vector animations
- PNG sprite sheets or transparent WebM video for AR and camera overlays
- RunwayML or Stable Diffusion API for AI-powered style filters (applied to UGC or webcam feed)
- TensorFlow.js for optional client-side style transfer or lightweight ML applications
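A small sketch of how a Bodymovin-exported Lottie file might be swapped in per trigger on the microsite; lottie-web's loadAnimation options are real, and the animation paths are placeholders.

```ts
// Sketch: load the hand-drawn Lottie animation that matches the current trigger.
// lottie-web's loadAnimation API is real; the /animations/*.json paths are placeholders.
import lottie, { AnimationItem } from "lottie-web";

export function mountTriggerAnimation(container: HTMLElement, trigger: string): AnimationItem {
  return lottie.loadAnimation({
    container,                            // DOM node to render into
    renderer: "svg",                      // vector output keeps the hand-drawn lines crisp
    loop: true,
    autoplay: true,
    path: `/animations/${trigger}.json`,  // e.g. rain-night.json exported via Bodymovin
  });
}
```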
Social Filters & AR Integration
- Spark AR for Instagram and Facebook
- Effect House for TikTok
- Lens Studio for Snapchat
- Filters use real-time weather, location, and time-based logic to trigger different animation overlays
- Assets exported as transparent animations, layered onto the user’s camera view
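Since Spark AR, Effect House, and Lens Studio each have their own scripting runtime, the shared trigger-to-overlay mapping can live in one place and be ported per filter. A sketch with placeholder asset names follows.

```ts
// Sketch: shared lookup that each AR filter script could port into its own runtime
// (Spark AR, Effect House, Lens Studio). Overlay and audio asset names are placeholders.
const overlays: Record<string, { sprite: string; audio: string }> = {
  "clear-day":   { sprite: "chill_breeze.webm", audio: "breeze_loop.mp3" },
  "clear-night": { sprite: "sleepy_cloud.webm", audio: "night_pad.mp3" },
  "rain-day":    { sprite: "rain_doodles.webm", audio: "rain_texture.mp3" },
  "rain-night":  { sprite: "neon_puddles.webm", audio: "lofi_rain.mp3" },
};

export function pickOverlay(trigger: string) {
  // Fall back to the daytime "clear" look if a combination has no dedicated asset yet.
  return overlays[trigger] ?? overlays["clear-day"];
}
```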
Backend, Storage & Processing
- Firebase or Supabase for:
  - Storing user-generated content
  - Optional login/authentication (email or OAuth)
  - Real-time syncing and database triggers
- Cloudinary or FFmpeg (via serverless functions) to process, overlay, and optimize video/image content
- Optional CDN delivery for fast content sharing across social
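A rough sketch of the UGC upload path, assuming Supabase Storage plus a clips table; the bucket, table, and column names are placeholder assumptions.

```ts
// Sketch: store a user's clip in Supabase Storage and record it for later processing.
// The "ugc" bucket, "clips" table, and column names are placeholder assumptions.
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_ANON_KEY!);

export async function saveClip(file: File, trigger: string) {
  const path = `uploads/${crypto.randomUUID()}-${file.name}`;

  // Upload the raw clip; a serverless function (Cloudinary or FFmpeg) can pick it up
  // afterwards to composite the campaign overlay and optimize it for social sharing.
  const { error: uploadError } = await supabase.storage.from("ugc").upload(path, file);
  if (uploadError) throw uploadError;

  // Track the clip along with the environmental trigger it was captured under.
  const { error: insertError } = await supabase
    .from("clips")
    .insert({ storage_path: path, trigger, status: "pending" });
  if (insertError) throw insertError;

  return path;
}
```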
Future Enhancements
- Spotify Web Playback SDK to play campaign playlists directly within the microsite
- Web Speech API for lightweight, accessible voice-based controls
- WebGL or Three.js for ambient interactions (animated backgrounds, light shifts, motion trails)
- Adobe Aero or 8thWall for future event-based AR installations or gallery integrations
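If voice controls make the roadmap, a minimal Web Speech API sketch could look like the following; browser support is uneven (Chrome ships it prefixed as webkitSpeechRecognition) and the command vocabulary is a placeholder.

```ts
// Sketch: lightweight voice control via the Web Speech API. Support varies across
// browsers, so this should stay an optional, progressive enhancement.
export function startVoiceControls(onCommand: (command: "play" | "pause" | "next") => void) {
  const SpeechRecognitionImpl =
    (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;
  if (!SpeechRecognitionImpl) return; // quietly skip where unsupported

  const recognition = new SpeechRecognitionImpl();
  recognition.continuous = true;
  recognition.lang = "en-US";

  recognition.onresult = (event: any) => {
    const transcript = event.results[event.results.length - 1][0].transcript.trim().toLowerCase();
    // Tiny command vocabulary; "play", "pause", and "next" are placeholder commands.
    if (transcript.includes("pause")) onCommand("pause");
    else if (transcript.includes("play")) onCommand("play");
    else if (transcript.includes("next")) onCommand("next");
  };

  recognition.start();
}
```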