I’ve been playing with the WebSocket feature introduced in ColdFusion 10
for some time now. I was trying out pushing images over a ColdFusion
WebSocket channel, and it worked just fine. This time I wanted to put
WebSockets to the test by pushing large data at regular intervals. I
thought maybe I could push video data over WebSockets, but it turned out
that there is no direct way to stream video to many clients. Then I came
across the drawImage function, which can be used to draw an
Image or a Video on an HTML5 Canvas. Once an image is drawn on the Canvas,
its base64-encoded data can be obtained by calling the toDataURL function
on the Canvas object. This data can then be transferred over a
ColdFusion WebSocket to all subscribers, who can then use it to
draw the image (video frame) on a Canvas of their own.
Here’s the demo video:
Please note, I’m not transferring the audio track present in the video, and I’m still trying to figure out how that can be achieved.
Here’s the Publisher code:
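The original snippet isn’t reproduced here, so below is a minimal sketch of what the publisher script might look like, based on the description in this post. It assumes a `<video id="myVideo">` and `<canvas id="myCanvas">` element on the page, plus a ColdFusion `<cfwebsocket>` tag that exposes a client-side object (named `mycfwebsocketobject` here, an illustrative name) with a publish method; the element IDs are assumptions, while the channel name 'myChannel' and the 50ms/20fps timing come from the post.

```javascript
// Capture the playing video frame-by-frame and publish each frame
// over the ColdFusion WebSocket channel 'myChannel'.

var FPS = 20;
var FRAME_INTERVAL_MS = 1000 / FPS; // 50ms between frames, as in the post

function draw(video, canvas, socket) {
  if (video.paused || video.ended) {
    return; // stop capturing once playback stops
  }
  var ctx = canvas.getContext('2d');
  // Draw the current video frame onto the canvas...
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  // ...grab it as a base64-encoded data URL...
  var frame = canvas.toDataURL('image/jpeg');
  // ...and push it to every subscriber of the channel.
  socket.publish('myChannel', frame);
  // Schedule the next frame capture.
  setTimeout(function () { draw(video, canvas, socket); }, FRAME_INTERVAL_MS);
}

// Start publishing as soon as the user hits play.
function startPublishing() {
  var video = document.getElementById('myVideo');
  var canvas = document.getElementById('myCanvas');
  video.addEventListener('play', function () {
    draw(video, canvas, mycfwebsocketobject);
  });
}
```

Using setTimeout from within draw (rather than a fixed setInterval) means a slow toDataURL call simply delays the next frame instead of letting captures pile up.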
As you can see from the above code, once you start playing the video the
draw function is called. Here I’ve drawn the video on a Canvas using
the drawImage function and then used the toDataURL function
to get the base64-encoded data of the image. This is then transferred
over a ColdFusion WebSocket channel (‘myChannel’). I’m calling this
function (‘draw’) every 50ms to draw the current video frame on the
canvas (to achieve 20fps) and transfer the image over the WebSocket.
The client/subscriber, on receiving the data, draws the image (video frame) on a canvas. Here’s the subscriber code:
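Again, the original snippet isn’t reproduced here, so this is a sketch of what the subscriber script might look like. It assumes a `<canvas id="myCanvas">` element and a `<cfwebsocket>` tag subscribed to 'myChannel' with its onMessage attribute pointing at `msgHandler`; the element ID and handler name are illustrative. ColdFusion delivers channel messages with a type of 'data' and the payload in the message’s data field.

```javascript
// Returns true only for channel messages that carry a frame payload;
// the cfwebsocket object also delivers status messages of other types.
function isFrameMessage(message) {
  return !!(message && message.type === 'data');
}

// Load the received base64 data URL into an Image, then draw it.
function drawFrame(canvas, dataUrl) {
  var img = new Image();
  img.onload = function () {
    // drawImage accepts an Image (or Video) element, not raw base64
    // data, so the data URL is first loaded into an Image object.
    canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
  };
  img.src = dataUrl; // the base64 data URL received over the WebSocket
}

// Callback wired up via the cfwebsocket tag's onMessage attribute.
function msgHandler(message) {
  if (isFrameMessage(message)) {
    drawFrame(document.getElementById('myCanvas'), message.data);
  }
}
```

Drawing happens inside the Image’s onload callback, so frames that arrive while a previous image is still decoding are simply painted in arrival order.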
On the client side, once the data is received over the WebSocket it is assigned to the source of an Image object. I do this because the drawImage function takes either an Image or a Video as its first argument and doesn’t accept base64 data directly. Once the Image is loaded, it is ready to be drawn on the canvas. This process continues until the video ends or the user pauses it.