Media & Animation Events
Handle video/audio playback state changes, image loading, CSS transitions, and animation lifecycle events.
Event Reference
Media Events (Video & Audio)
| Event | Fires When | Event Object |
|---|---|---|
| `onPlay` | Playback starts or resumes | `React.SyntheticEvent` |
| `onPause` | Playback is paused | `React.SyntheticEvent` |
| `onEnded` | Playback reaches the end | `React.SyntheticEvent` |
| `onTimeUpdate` | The current playback time changes (fires frequently) | `React.SyntheticEvent` |
| `onVolumeChange` | Volume or mute state changes | `React.SyntheticEvent` |
| `onLoadedData` | The first frame of media has loaded | `React.SyntheticEvent` |
| `onError` | Media loading fails | `React.SyntheticEvent` |
| `onWaiting` | Playback stalls due to buffering | `React.SyntheticEvent` |
| `onSeeking` | A seek operation begins | `React.SyntheticEvent` |
| `onSeeked` | A seek operation completes | `React.SyntheticEvent` |
Image Events
| Event | Fires When | Event Object |
|---|---|---|
| `onLoad` | The image finishes loading | `React.SyntheticEvent` |
| `onError` | The image fails to load | `React.SyntheticEvent` |
CSS Animation Events
| Event | Fires When | Event Object |
|---|---|---|
| `onAnimationStart` | A CSS animation begins | `React.AnimationEvent` |
| `onAnimationEnd` | A CSS animation completes | `React.AnimationEvent` |
| `onAnimationIteration` | A CSS animation loop restarts | `React.AnimationEvent` |
CSS Transition Events
| Event | Fires When | Event Object |
|---|---|---|
| `onTransitionEnd` | A CSS transition completes | `React.TransitionEvent` |
Recipe
Quick-reference recipe card -- copy-paste ready.
```tsx
// Video play/pause
"use client";
import { useRef, useState } from "react";

export function VideoToggle({ src }: { src: string }) {
  const videoRef = useRef<HTMLVideoElement>(null);
  const [playing, setPlaying] = useState(false);

  const toggle = () => {
    const video = videoRef.current;
    if (!video) return;
    if (video.paused) {
      video.play().catch(() => {}); // autoplay policies can reject play()
    } else {
      video.pause();
    }
  };

  return (
    <div>
      <video
        ref={videoRef}
        src={src}
        onPlay={() => setPlaying(true)}
        onPause={() => setPlaying(false)}
      />
      <button onClick={toggle}>{playing ? "Pause" : "Play"}</button>
    </div>
  );
}
```

```tsx
// Image onLoad / onError
"use client";
import { useState } from "react";

export function SafeImage({ src, alt }: { src: string; alt: string }) {
  const [loaded, setLoaded] = useState(false);
  const [error, setError] = useState(false);

  if (error) return <div className="bg-gray-200 p-4">Failed to load image</div>;

  return (
    <img
      src={src}
      alt={alt}
      onLoad={() => setLoaded(true)}
      onError={() => setError(true)}
      className={loaded ? "opacity-100" : "opacity-0"}
      style={{ transition: "opacity 0.3s" }}
    />
  );
}
```

```tsx
// Animation end callback
"use client";
import { useState } from "react";

export function FadeOutAndRemove() {
  const [visible, setVisible] = useState(true);
  const [animating, setAnimating] = useState(false);
  const handleDismiss = () => setAnimating(true);

  if (!visible) return null;

  return (
    <div
      className={animating ? "animate-fadeOut" : ""}
      onAnimationEnd={() => {
        if (animating) setVisible(false);
      }}
    >
      <p>I will fade away</p>
      <button onClick={handleDismiss}>Dismiss</button>
    </div>
  );
}
```

When to reach for this: You need to synchronize UI state with media playback, the image loading lifecycle, or CSS animation/transition completion.
Working Example
```tsx
// CustomVideoPlayer.tsx
"use client";
import { useRef, useState, useCallback } from "react";

export default function CustomVideoPlayer({ src }: { src: string }) {
  const videoRef = useRef<HTMLVideoElement>(null);
  const [playing, setPlaying] = useState(false);
  const [currentTime, setCurrentTime] = useState(0);
  const [duration, setDuration] = useState(0);
  const [volume, setVolume] = useState(1);
  const [muted, setMuted] = useState(false);
  const [buffering, setBuffering] = useState(false);

  const togglePlay = useCallback(() => {
    const video = videoRef.current;
    if (!video) return;
    if (video.paused) {
      video.play().catch(() => {}); // autoplay policies can reject play()
    } else {
      video.pause();
    }
  }, []);

  const handleTimeUpdate = useCallback(
    (e: React.SyntheticEvent<HTMLVideoElement>) => {
      setCurrentTime(e.currentTarget.currentTime);
    },
    []
  );

  const handleLoadedData = useCallback(
    (e: React.SyntheticEvent<HTMLVideoElement>) => {
      setDuration(e.currentTarget.duration);
    },
    []
  );

  const handleVolumeChange = useCallback(
    (e: React.SyntheticEvent<HTMLVideoElement>) => {
      setVolume(e.currentTarget.volume);
      setMuted(e.currentTarget.muted);
    },
    []
  );

  const handleSeek = useCallback((e: React.ChangeEvent<HTMLInputElement>) => {
    const video = videoRef.current;
    if (!video) return;
    video.currentTime = Number(e.target.value);
  }, []);

  const handleVolumeSlider = useCallback(
    (e: React.ChangeEvent<HTMLInputElement>) => {
      const video = videoRef.current;
      if (!video) return;
      video.volume = Number(e.target.value);
    },
    []
  );

  const toggleMute = useCallback(() => {
    const video = videoRef.current;
    if (!video) return;
    video.muted = !video.muted;
  }, []);

  const formatTime = (seconds: number) => {
    const m = Math.floor(seconds / 60);
    const s = Math.floor(seconds % 60);
    return `${m}:${s.toString().padStart(2, "0")}`;
  };

  const progress = duration > 0 ? (currentTime / duration) * 100 : 0;

  return (
    <div className="max-w-2xl mx-auto bg-black rounded-xl overflow-hidden">
      <div className="relative">
        <video
          ref={videoRef}
          src={src}
          className="w-full"
          onPlay={() => setPlaying(true)}
          onPause={() => setPlaying(false)}
          onTimeUpdate={handleTimeUpdate}
          onLoadedData={handleLoadedData}
          onVolumeChange={handleVolumeChange}
          onWaiting={() => setBuffering(true)}
          onSeeked={() => setBuffering(false)}
          onEnded={() => setPlaying(false)}
          onClick={togglePlay}
        />
        {buffering && (
          <div className="absolute inset-0 flex items-center justify-center bg-black/30">
            <div className="w-10 h-10 border-4 border-white border-t-transparent rounded-full animate-spin" />
          </div>
        )}
      </div>
      {/* Progress bar */}
      <div className="px-4 pt-2">
        <div className="relative h-1 bg-gray-700 rounded-full">
          <div
            className="absolute h-full bg-blue-500 rounded-full"
            style={{ width: `${progress}%` }}
          />
        </div>
        <input
          type="range"
          min={0}
          max={duration || 0}
          step={0.1}
          value={currentTime}
          onChange={handleSeek}
          className="w-full opacity-0 absolute cursor-pointer"
          style={{ marginTop: "-6px", height: "12px" }}
          aria-label="Seek"
        />
      </div>
      {/* Controls */}
      <div className="flex items-center gap-4 p-4 text-white">
        <button
          onClick={togglePlay}
          className="text-xl"
          aria-label={playing ? "Pause" : "Play"}
        >
          {playing ? "⏸" : "▶"}
        </button>
        <span className="text-sm font-mono">
          {formatTime(currentTime)} / {formatTime(duration)}
        </span>
        <div className="flex items-center gap-2 ml-auto">
          <button onClick={toggleMute} aria-label={muted ? "Unmute" : "Mute"}>
            {muted || volume === 0 ? "🔇" : "🔊"}
          </button>
          <input
            type="range"
            min={0}
            max={1}
            step={0.05}
            value={muted ? 0 : volume}
            onChange={handleVolumeSlider}
            className="w-20"
            aria-label="Volume"
          />
        </div>
      </div>
    </div>
  );
}
```

What this demonstrates:
- Using `onPlay`, `onPause`, and `onEnded` to track playback state
- Using `onTimeUpdate` to sync a progress bar with the current time
- Using `onLoadedData` to capture the video duration
- Using `onVolumeChange` to reflect volume and mute state changes
- Using `onWaiting` and `onSeeked` for a buffering indicator
- Controlling the video element via `ref` while listening to events for state sync
Deep Dive
How It Works
- Media events are dispatched by the browser's native `<video>` and `<audio>` elements. React wraps them as `SyntheticEvent` instances.
- `onTimeUpdate` fires approximately 4 times per second during playback (browser-dependent, not tied to `requestAnimationFrame`).
- Image `onLoad` fires after the image is fully decoded and ready to display. `onError` fires if the source URL fails.
- CSS `onAnimationEnd` fires once per animation per element. If multiple animations run on the same element, it fires for each.
- `onTransitionEnd` fires for each CSS property that transitions. Transitioning `all` properties may produce multiple events.
Variations
Image lazy loading with fade-in on load:
```tsx
"use client";
import { useState, useCallback } from "react";

export function LazyImage({
  src,
  alt,
  width,
  height,
}: {
  src: string;
  alt: string;
  width: number;
  height: number;
}) {
  const [loaded, setLoaded] = useState(false);
  const [error, setError] = useState(false);
  const handleLoad = useCallback(() => setLoaded(true), []);
  const handleError = useCallback(() => setError(true), []);

  return (
    <div
      className="relative overflow-hidden bg-gray-100"
      style={{ width, height }}
    >
      {error ? (
        <div className="flex items-center justify-center w-full h-full text-gray-400">
          Failed to load
        </div>
      ) : (
        <>
          {!loaded && (
            <div className="absolute inset-0 animate-pulse bg-gray-200" />
          )}
          <img
            src={src}
            alt={alt}
            loading="lazy"
            onLoad={handleLoad}
            onError={handleError}
            className={`w-full h-full object-cover transition-opacity duration-500 ${
              loaded ? "opacity-100" : "opacity-0"
            }`}
          />
        </>
      )}
    </div>
  );
}
```

Animation chaining with `onAnimationEnd`:
```tsx
"use client";
import { useState, useCallback } from "react";

type Phase = "idle" | "slideIn" | "pulse" | "slideOut" | "done";

export function AnimationChain() {
  const [phase, setPhase] = useState<Phase>("idle");
  const start = useCallback(() => setPhase("slideIn"), []);

  const handleAnimationEnd = useCallback(
    (e: React.AnimationEvent) => {
      // Only handle animations on this specific element
      if (e.target !== e.currentTarget) return;
      const transitions: Record<string, Phase> = {
        slideIn: "pulse",
        pulse: "slideOut",
        slideOut: "done",
      };
      const next = transitions[phase];
      if (next) setPhase(next);
    },
    [phase]
  );

  if (phase === "done") {
    return <button onClick={() => setPhase("idle")}>Reset</button>;
  }

  const animationClass: Record<Phase, string> = {
    idle: "",
    slideIn: "animate-slideInRight",
    pulse: "animate-pulse",
    slideOut: "animate-slideOutLeft",
    done: "",
  };

  return (
    <div>
      {phase === "idle" ? (
        <button onClick={start}>Start Animation Chain</button>
      ) : (
        <div
          className={`p-8 bg-blue-500 text-white rounded-lg ${animationClass[phase]}`}
          onAnimationEnd={handleAnimationEnd}
        >
          Phase: {phase}
        </div>
      )}
    </div>
  );
}
```

Transition completion callback:
```tsx
"use client";
import { useState, useCallback } from "react";

export function CollapsePanel({
  children,
  title,
}: {
  children: React.ReactNode;
  title: string;
}) {
  const [open, setOpen] = useState(false);
  const [rendered, setRendered] = useState(false);

  const toggle = useCallback(() => {
    if (!open) {
      setRendered(true); // mount content before expanding
      requestAnimationFrame(() => setOpen(true));
    } else {
      setOpen(false);
    }
  }, [open]);

  const handleTransitionEnd = useCallback(
    (e: React.TransitionEvent) => {
      // Only respond to the height transition, not other properties
      if (e.propertyName === "max-height" && !open) {
        setRendered(false); // unmount after the collapse completes
      }
    },
    [open]
  );

  return (
    <div className="border rounded-lg">
      <button onClick={toggle} className="w-full p-4 text-left font-semibold">
        {title} {open ? "▲" : "▼"}
      </button>
      <div
        className="overflow-hidden transition-[max-height] duration-300 ease-in-out"
        style={{ maxHeight: open ? "500px" : "0px" }}
        onTransitionEnd={handleTransitionEnd}
      >
        {rendered && <div className="p-4 border-t">{children}</div>}
      </div>
    </div>
  );
}
```

Audio visualization basics:
```tsx
"use client";
import { useRef, useCallback, useEffect, useState } from "react";

export function AudioVisualizer({ src }: { src: string }) {
  const audioRef = useRef<HTMLAudioElement>(null);
  const canvasRef = useRef<HTMLCanvasElement>(null);
  const analyserRef = useRef<AnalyserNode | null>(null);
  const [initialized, setInitialized] = useState(false);

  const initAudio = useCallback(() => {
    if (initialized || !audioRef.current) return;
    const ctx = new AudioContext();
    const source = ctx.createMediaElementSource(audioRef.current);
    const analyser = ctx.createAnalyser();
    analyser.fftSize = 256;
    source.connect(analyser);
    analyser.connect(ctx.destination);
    analyserRef.current = analyser;
    setInitialized(true);
  }, [initialized]);

  useEffect(() => {
    if (!initialized) return;
    const analyser = analyserRef.current;
    const canvas = canvasRef.current;
    if (!analyser || !canvas) return;
    const ctx = canvas.getContext("2d");
    if (!ctx) return;

    const bufferLength = analyser.frequencyBinCount;
    const dataArray = new Uint8Array(bufferLength);
    let animationId: number;

    const draw = () => {
      animationId = requestAnimationFrame(draw);
      analyser.getByteFrequencyData(dataArray);
      ctx.fillStyle = "#000";
      ctx.fillRect(0, 0, canvas.width, canvas.height);
      const barWidth = (canvas.width / bufferLength) * 2.5;
      let x = 0;
      for (let i = 0; i < bufferLength; i++) {
        const barHeight = (dataArray[i] / 255) * canvas.height;
        ctx.fillStyle = `hsl(${(i / bufferLength) * 360}, 80%, 50%)`;
        ctx.fillRect(x, canvas.height - barHeight, barWidth, barHeight);
        x += barWidth + 1;
      }
    };
    draw();
    return () => cancelAnimationFrame(animationId);
  }, [initialized]);

  return (
    <div className="space-y-4">
      <canvas
        ref={canvasRef}
        width={600}
        height={200}
        className="bg-black rounded-lg"
      />
      <audio
        ref={audioRef}
        src={src}
        controls
        onPlay={initAudio}
        crossOrigin="anonymous"
      />
    </div>
  );
}
```

Video progress bar with preview:
```tsx
"use client";
import { useRef, useState, useCallback } from "react";

export function VideoProgressBar({
  videoRef,
  duration,
  currentTime,
}: {
  videoRef: React.RefObject<HTMLVideoElement | null>;
  duration: number;
  currentTime: number;
}) {
  const barRef = useRef<HTMLDivElement>(null);
  const [hoverTime, setHoverTime] = useState<number | null>(null);

  const handleClick = useCallback(
    (e: React.MouseEvent<HTMLDivElement>) => {
      const rect = e.currentTarget.getBoundingClientRect();
      const fraction = (e.clientX - rect.left) / rect.width;
      if (videoRef.current) {
        videoRef.current.currentTime = fraction * duration;
      }
    },
    [videoRef, duration]
  );

  const handleMouseMove = useCallback(
    (e: React.MouseEvent<HTMLDivElement>) => {
      const rect = e.currentTarget.getBoundingClientRect();
      const fraction = (e.clientX - rect.left) / rect.width;
      setHoverTime(fraction * duration);
    },
    [duration]
  );

  const progress = duration > 0 ? (currentTime / duration) * 100 : 0;
  const formatTime = (s: number) =>
    `${Math.floor(s / 60)}:${Math.floor(s % 60)
      .toString()
      .padStart(2, "0")}`;

  return (
    <div
      ref={barRef}
      className="relative h-2 bg-gray-700 rounded cursor-pointer group"
      onClick={handleClick}
      onMouseMove={handleMouseMove}
      onMouseLeave={() => setHoverTime(null)}
    >
      <div
        className="h-full bg-blue-500 rounded"
        style={{ width: `${progress}%` }}
      />
      {hoverTime !== null && (
        <div
          className="absolute -top-8 bg-black text-white text-xs px-2 py-1 rounded -translate-x-1/2"
          style={{ left: `${(hoverTime / duration) * 100}%` }}
        >
          {formatTime(hoverTime)}
        </div>
      )}
    </div>
  );
}
```

TypeScript Notes
```ts
// Media event -- typed to the specific element
function handlePlay(e: React.SyntheticEvent<HTMLVideoElement>) {
  const video: HTMLVideoElement = e.currentTarget;
  const time: number = video.currentTime;
  const dur: number = video.duration;
  const paused: boolean = video.paused;
  const vol: number = video.volume;
}

// Audio element events use the same type
function handleAudioEnd(e: React.SyntheticEvent<HTMLAudioElement>) {
  const audio: HTMLAudioElement = e.currentTarget;
  console.log("Audio ended at", audio.duration);
}

// Image load/error
function handleImgLoad(e: React.SyntheticEvent<HTMLImageElement>) {
  const img: HTMLImageElement = e.currentTarget;
  const naturalWidth: number = img.naturalWidth;
  const naturalHeight: number = img.naturalHeight;
}

// Animation events
function handleAnimEnd(e: React.AnimationEvent<HTMLDivElement>) {
  const name: string = e.animationName; // CSS animation name
  const elapsed: number = e.elapsedTime; // seconds
  const pseudo: string = e.pseudoElement; // "::before", "::after", or ""
}

// Transition events
function handleTransEnd(e: React.TransitionEvent<HTMLDivElement>) {
  const property: string = e.propertyName; // e.g., "opacity", "transform"
  const elapsed: number = e.elapsedTime; // seconds
  const pseudo: string = e.pseudoElement;
}

// Ref typing for media elements
const videoRef = useRef<HTMLVideoElement>(null);
const audioRef = useRef<HTMLAudioElement>(null);
```

Gotchas
- `onTransitionEnd` fires once per property -- Transitioning `all` or multiple properties (e.g., `opacity` and `transform`) fires the event multiple times. Fix: Check `e.propertyName` to respond only to the property you care about: `if (e.propertyName !== "opacity") return;`.
- `onAnimationEnd` bubbles from child elements -- If a child has its own CSS animation, the event bubbles up to your handler. Fix: Compare `e.target === e.currentTarget` to ensure you only respond to your element's animation.
- `onTimeUpdate` does not fire every frame -- It fires roughly 4 times per second, not 60, which is too coarse for frame-accurate UI. Fix: For smooth progress bars, use `requestAnimationFrame` with `videoRef.current.currentTime` instead.
- Media `onError` gives minimal information -- The `SyntheticEvent` does not include an error code directly. Fix: Access `e.currentTarget.error` (a `MediaError` object) and read `e.currentTarget.error?.code` and `e.currentTarget.error?.message` for details.
- `onLoad` on `<img>` fires for cached images too -- But the timing differs: a cached image may fire `onLoad` almost immediately after mount. Fix: Always initialize your loading state to `false` and let `onLoad` set it to `true`, rather than assuming async timing.
- CSS animations on unmounted elements do not fire `onAnimationEnd` -- If you remove the element from the DOM before the animation finishes, the event never fires. Fix: Wait for `onAnimationEnd` before setting the state that removes the element, or use the Web Animations API for promise-based control.
- `play()` returns a Promise that can reject -- If the user has not interacted with the page, autoplay policies reject it. Fix: Always `await` or `.catch()` the play promise: `videoRef.current.play().catch(() => { /* handle autoplay block */ })`.
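For the `onError` gotcha, the `MediaError.code` values are numeric (1 through 4 in the HTMLMediaElement spec). A small lookup helper makes them readable in logs; the helper name and message strings below are our own wording, only the codes come from the spec:

```ts
// Standard MediaError codes from the HTMLMediaElement spec;
// the message strings are our own wording.
const MEDIA_ERROR_MESSAGES: Record<number, string> = {
  1: "MEDIA_ERR_ABORTED: fetching was aborted by the user",
  2: "MEDIA_ERR_NETWORK: a network error interrupted the download",
  3: "MEDIA_ERR_DECODE: the media could not be decoded",
  4: "MEDIA_ERR_SRC_NOT_SUPPORTED: the source format is not supported",
};

function describeMediaError(code: number | undefined): string {
  if (code === undefined) return "Unknown media error";
  return MEDIA_ERROR_MESSAGES[code] ?? `Unrecognized media error code ${code}`;
}
```

In an `onError` handler you would call it as `describeMediaError(e.currentTarget.error?.code)`.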
Alternatives
| Alternative | Use When | Don't Use When |
|---|---|---|
| Web Animations API (`element.animate()`) | You need JS-driven animations with promises and playback control | CSS keyframes are sufficient |
| Framer Motion (`motion.div`) | You want declarative React animations with enter/exit/layout | You only need to handle native media events |
| `requestAnimationFrame` | You need frame-accurate progress tracking for video | `onTimeUpdate` precision is acceptable |
| `<video>` with HLS.js / dash.js | You need adaptive streaming (ABR) | A single MP4 file works fine |
| Next.js `<Image>` | You want automatic optimization, lazy loading, and blur placeholders | You need fine-grained `onLoad` / `onError` control with custom logic |
FAQs
How do I detect when a video is buffering?
- Use `onWaiting` to detect when playback stalls due to insufficient data
- Use `onSeeked` or `onCanPlay` to detect when enough data is available to resume
- Show a loading spinner between `onWaiting` and the next `onSeeked`/`onCanPlay`
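That buffering logic reduces to a tiny pure function. A sketch, with event names mirroring the DOM events (the reducer itself is our own construction, not a React or DOM API):

```ts
type MediaEventName = "waiting" | "seeked" | "canplay" | "playing";

// "waiting" means playback stalled; the other events signal that
// enough data is available again.
function nextBufferingState(buffering: boolean, event: MediaEventName): boolean {
  if (event === "waiting") return true;
  if (event === "seeked" || event === "canplay" || event === "playing") {
    return false;
  }
  return buffering;
}
```

In a component, call it from the corresponding handlers and store the result with `setBuffering`.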
Why does video.play() sometimes throw an error?
- Browser autoplay policies require user interaction before media can play with sound
- `play()` returns a Promise that rejects if autoplay is blocked
- Always handle the rejection: `video.play().catch(() => { /* show play button */ })`
- Videos with the `muted` attribute are usually allowed to autoplay
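One way to centralize that rejection handling is a small wrapper that reports success as a boolean. A sketch (`Playable` and `tryPlay` are our own names; an `HTMLMediaElement` satisfies the interface):

```ts
interface Playable {
  play(): Promise<void>;
}

// Resolves true if playback started, false if play() rejected
// (e.g. blocked by an autoplay policy).
async function tryPlay(media: Playable): Promise<boolean> {
  try {
    await media.play();
    return true;
  } catch {
    return false;
  }
}
```

At each call site, `if (!(await tryPlay(video))) { /* show a play button */ }` is then enough.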
How do I chain multiple CSS animations on the same element?
- Listen to `onAnimationEnd` and change the CSS class to trigger the next animation
- Check `e.animationName` to know which animation just finished
- Set state to apply the next animation class, creating a sequential chain
Why does onTransitionEnd fire multiple times?
- It fires once per CSS property that transitions
- If you transition `transform` and `opacity`, you get two events
- Filter by `e.propertyName` to respond only to the one you care about
How do I build a custom progress bar for a video player?
- Use `onLoadedData` to get `video.duration`
- Use `onTimeUpdate` to track `video.currentTime`
- Calculate `(currentTime / duration) * 100` for the progress percentage
- For smoother updates, use `requestAnimationFrame` instead of `onTimeUpdate`
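The percentage calculation deserves a guard: before the video's metadata loads, `duration` is `NaN` (or 0), and a naive division produces `NaN` widths. A hedged sketch (`progressPercent` is our own helper name):

```ts
// Clamped progress percentage that tolerates the pre-metadata state,
// where duration is still NaN or 0.
function progressPercent(currentTime: number, duration: number): number {
  if (!Number.isFinite(duration) || duration <= 0) return 0;
  return Math.min(100, Math.max(0, (currentTime / duration) * 100));
}
```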
How do I handle image loading errors gracefully?
```tsx
const [error, setError] = useState(false);
return error ? (
  <div className="placeholder">Image unavailable</div>
) : (
  <img
    src={src}
    alt={alt}
    onError={() => setError(true)}
  />
);
```

Can I detect when a CSS animation loops with onAnimationIteration?
- Yes, `onAnimationIteration` fires at the end of each iteration for animations with `animation-iteration-count` greater than 1
- It does not fire at the end of the last iteration -- use `onAnimationEnd` for that
- The event includes `animationName` and `elapsedTime` to identify which cycle completed
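A handler that counts loops only needs to accumulate a counter per animation name. A minimal sketch of that bookkeeping as plain logic (the helper name is ours; in a component you would call it from `onAnimationIteration` with `e.animationName`):

```ts
// Increments and returns the completed-loop count for a given animation name.
function countIteration(counts: Map<string, number>, animationName: string): number {
  const next = (counts.get(animationName) ?? 0) + 1;
  counts.set(animationName, next);
  return next;
}
```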
Related
- Pointer Events -- Unified input across mouse, touch, and pen
- Touch Events -- Multi-touch gesture handling for mobile
- Scroll Events -- Scroll position tracking and infinite scroll