WebM is, in essence, a container format for carrying VP8 or VP9 video and Vorbis or Opus audio. It does not specify how it should be streamed, and compared to other container formats such as MPEG-2 TS or MP4 it enjoys little streaming support in the wider ecosystem. It can, however, still be used for streaming.
Note that by "streaming" I do not mean simply downloading a single file from a website, or embedding a single (long) clip in an HTML5 <video> tag. Over the past years, several more advanced streaming technologies have come into use:
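For contrast, the "plain" case that does not count as streaming here is a progressive download of one complete file, e.g. (the file name is just a placeholder):

```html
<!-- Progressive download: the browser fetches one monolithic WebM file.
     No playlist, no segments, no bitrate adaptation. -->
<video controls>
  <source src="clip.webm" type="video/webm">
</video>
```

The technologies below instead split the media into many small pieces, or negotiate delivery through a dedicated protocol.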
RTMP streaming (Real-Time Messaging Protocol) requires an RTMP server such as Adobe Flash Media Server, which streams Flash-supported file formats (MP4, FLV) to the client. It is still quite widespread, but slowly and surely dying out, like all Flash-based technologies. Since WebM is not supported in Flash, it cannot be used here.
RTSP streaming (Real Time Streaming Protocol) is a control protocol for streaming servers such as the QuickTime Streaming Server or the Helix Server. Client and server exchange control messages through this protocol, while the actual media data is carried in RTP (Real-time Transport Protocol) payloads. RTSP is rarely found on the Web; it is more common in IPTV deployments.
There are specifications describing how to encapsulate WebM's codecs in RTP, such as RFC 7741 for VP8 video.
HTTP Live Streaming (HLS) and MPEG-DASH are adaptive streaming technologies in which the client requests chunks of a video from a server through plain HTTP requests, guided by an M3U8 playlist file (in the case of HLS) or an MPD manifest (in the case of DASH). This index file lists the segments that contain the actual audio and video data.
In HLS, the media must be stored in MPEG-2 TS or, since 2017, fragmented MP4 (ISO Base Media File Format) segments. MPEG-DASH is container-agnostic and has broader support; segmented WebM can be used there as well.
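To make the DASH case concrete, a minimal MPD for on-demand WebM content might look roughly like the following sketch. The file names, byte ranges, bandwidths, and profile string are illustrative placeholders, not values from the original article; real manifests are usually generated by a packager rather than written by hand:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative MPD: one VP9 video representation and one Opus audio
     representation, each a single WebM file indexed by byte ranges. -->
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011"
     type="static"
     mediaPresentationDuration="PT60S"
     minBufferTime="PT2S"
     profiles="urn:mpeg:dash:profile:webm-on-demand:2012">
  <Period>
    <AdaptationSet mimeType="video/webm" codecs="vp9">
      <Representation id="video" bandwidth="1000000" width="1280" height="720">
        <BaseURL>video_720p.webm</BaseURL>
        <!-- indexRange points at the Cues element; range 0-294 is the init data -->
        <SegmentBase indexRange="295-1900">
          <Initialization range="0-294"/>
        </SegmentBase>
      </Representation>
    </AdaptationSet>
    <AdaptationSet mimeType="audio/webm" codecs="opus">
      <Representation id="audio" bandwidth="128000">
        <BaseURL>audio.webm</BaseURL>
        <SegmentBase indexRange="250-800">
          <Initialization range="0-249"/>
        </SegmentBase>
      </Representation>
    </AdaptationSet>
  </Period>
</MPD>
```

The client downloads this manifest first, then fetches the initialization data and individual media ranges with ordinary HTTP range requests, switching between representations as bandwidth allows.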