stream¶
fMP4 stream encoder used by the streaming server.
The client's Media Source Extensions player needs a continuous fMP4
byte stream: first an initialization segment (ftyp + moov),
then one or more media segments (moof + mdat). We pipe raw
RGB frames into an ffmpeg subprocess configured for fragmented output
via -movflags empty_moov+default_base_moof+frag_keyframe+faststart
and stream the bytes back out.
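The flag set above maps onto an ffmpeg command line roughly like the following sketch. The helper function is hypothetical (not part of this module), but the flags themselves are real ffmpeg options: raw RGB24 frames go in on stdin, a fragmented MP4 comes out on stdout.

```python
def build_ffmpeg_args(width: int, height: int, fps: int,
                      ffmpeg_path: str = "ffmpeg",
                      preset: str = "ultrafast",
                      pixel_format_out: str = "yuv420p") -> list[str]:
    """Illustrative argv for an fMP4-producing ffmpeg subprocess."""
    return [
        ffmpeg_path,
        # Input: raw RGB24 frames piped on stdin, no container.
        "-f", "rawvideo",
        "-pix_fmt", "rgb24",
        "-s", f"{width}x{height}",
        "-r", str(fps),
        "-i", "pipe:0",
        # Output: H.264 in a fragmented MP4 written to stdout.
        "-c:v", "libx264",
        "-preset", preset,
        "-pix_fmt", pixel_format_out,
        "-movflags", "empty_moov+default_base_moof+frag_keyframe+faststart",
        "-f", "mp4",
        "pipe:1",
    ]
```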
Classes¶
fastvideo.entrypoints.streaming.stream.FragmentedMP4Chunk
dataclass
¶
A single fMP4 byte chunk emitted by FragmentedMP4Encoder.
kind identifies whether the chunk is the init segment (must be
fed into the client's SourceBuffer first) or a media fragment.
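A minimal sketch of what such a dataclass could look like, assuming exactly two chunk kinds as described above (the field names here are illustrative, not copied from the source):

```python
from dataclasses import dataclass
from typing import Literal


@dataclass(frozen=True)
class FragmentedMP4Chunk:
    # "init" chunks carry ftyp + moov and must reach the client's
    # SourceBuffer before any media fragment; "media" chunks carry
    # moof + mdat and can be appended in arrival order afterwards.
    kind: Literal["init", "media"]
    data: bytes
```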
fastvideo.entrypoints.streaming.stream.FragmentedMP4Encoder
¶
FragmentedMP4Encoder(*, width: int, height: int, fps: int, segment_idx: int, stream_id: str | None = None, ffmpeg_path: str = 'ffmpeg', preset: str = 'ultrafast', pixel_format_out: str = 'yuv420p', extra_args: list[str] | None = None)
Stream RGB frames in, fMP4 chunks out.
One encoder covers one segment. The server creates a new encoder at each ltx2_segment_start boundary, so each segment becomes one media fragment the client can append independently.
Example::

    encoder = FragmentedMP4Encoder(width=1024, height=576, fps=24,
                                   segment_idx=0)
    async with encoder:
        async for chunk in encoder.encode(frames):
            await websocket.send_bytes(chunk.data)
Source code in fastvideo/entrypoints/streaming/stream.py
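The init/media distinction can be recovered from the byte stream itself: every top-level MP4 box begins with a 4-byte big-endian size followed by a 4-character type, so ftyp/moov (init segment) can be told apart from moof/mdat (media fragments). A minimal sketch, where split_boxes is a hypothetical helper and the rarer size encodings (size 0 meaning "to end of file", size 1 with a 64-bit largesize) are deliberately ignored:

```python
import struct


def split_boxes(buf: bytes) -> list[tuple[str, bytes]]:
    """Split a buffer into top-level MP4 boxes as (type, payload) pairs.

    Assumes plain 32-bit box sizes; size-0 and size-1 boxes are not handled.
    """
    boxes = []
    off = 0
    while off + 8 <= len(buf):
        size, = struct.unpack_from(">I", buf, off)
        box_type = buf[off + 4:off + 8].decode("ascii")
        boxes.append((box_type, buf[off + 8:off + size]))
        off += size
    return boxes
```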
Functions¶
fastvideo.entrypoints.streaming.stream.FragmentedMP4Encoder.encode
async
¶
encode(frames: list[ndarray] | AsyncIterator[ndarray]) -> AsyncIterator[FragmentedMP4Chunk]
Feed frames into ffmpeg and yield fMP4 chunks as they appear.
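The data flow inside encode can be sketched with a generic stdin-to-stdout pipe helper. pipe_through below is hypothetical and simplified: the real encoder runs the ffmpeg command here, feeds raw frame bytes, and wraps the emitted bytes in chunk objects, whereas this sketch just relays bytes through any subprocess.

```python
import asyncio
from typing import AsyncIterator, Iterable


async def pipe_through(cmd: list[str], frames: Iterable[bytes],
                       chunk_size: int = 65536) -> AsyncIterator[bytes]:
    """Feed byte frames to a subprocess's stdin; yield stdout as it appears."""
    proc = await asyncio.create_subprocess_exec(
        *cmd,
        stdin=asyncio.subprocess.PIPE,
        stdout=asyncio.subprocess.PIPE,
    )

    async def feed() -> None:
        # Write concurrently with reading so a full pipe buffer
        # cannot deadlock the process.
        for frame in frames:
            proc.stdin.write(frame)
            await proc.stdin.drain()
        proc.stdin.close()

    feeder = asyncio.ensure_future(feed())
    while True:
        chunk = await proc.stdout.read(chunk_size)
        if not chunk:  # EOF: subprocess closed stdout
            break
        yield chunk
    await feeder
    await proc.wait()
```

Using `cat` as a stand-in subprocess, the helper simply echoes the frames back, which illustrates the shape of the loop without requiring ffmpeg.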