Transcript of Advanced Multimedia Processing for WebApp on Tizen
Advanced Multimedia Processing for WebApp on Tizen
1. HTML5 Video Player
- Open source Javascript Video Player Library: Video.js
2. Image Processing Library
- Open source Javascript Image Processing Library: JSFeat
- Open source Javascript Face Detection Library: clmtrackr
3. Audio Processing Library
- Open source Javascript Web Audio Library: P5.Sound
- Open source Javascript Sound Synthesis Library: WAD
• Open Source Software can be a Good Solution
No Licensing Fees
Easy to Manage
Continuous Improvement
• Open Source HTML5 Video Player Framework
Fortunately, there are many open source HTML5 video player frameworks we can use
They make it easy to build your own HTML5 video player into your website
They offer many options to style your player and extend its functionality
1. HTML5 Video Player
“Getting a simple player up and running for a given browser is not complicated. But once you want it to work across many browsers, it gets more complex”
Originally written by Michael Dale in the Streaming Media Magazine article "Navigating HTML5"
• HTML5 Video Player Comparison (http://html5video.org/wiki/HTML5_Player_Comparison)
Video.js is an open source JavaScript library for working with web video
It has HTML/CSS video player controls
It also provides a JavaScript API that works the same across HTML5, Flash, and other playback technologies
(1) Building your own HTML5 Video Player using Video.js
http://videojs.com
• Simple Embed
All the necessary files for Video.js are hosted on the Video.js CDN
Just include the following links in your page
http://videojs.com
<head>
<link href="http://vjs.zencdn.net/5.3.0/video-js.css" rel="stylesheet">
<script src="http://vjs.zencdn.net/5.3.0/video.js"></script>
</head>
<body>
<video id="my-video" class="video-js" controls preload="auto" width="640" height="264"
poster = "http://video-js.zencoder.com/oceans-clip.png"
data-setup="{}">
<source src="http://video-js.zencoder.com/oceans-clip.mp4" type='video/mp4'>
<p class="vjs-no-js">
To view this video please enable JavaScript, and consider upgrading to a web
browser that <a href="http://videojs.com/html5-video-support/" target="_blank">
supports HTML5 video</a>
</p>
</video>
</body>
• Include the Video.js javascript and CSS files in the head of your page
• To self-host, the addresses need to be changed
• Get source code and build your own library
The source code is available in the Video.js git repository
(https://github.com/videojs/)
To build the library, node.js and grunt are required
node.js: An asynchronous, event-driven JavaScript runtime for building scalable network applications
grunt: A task-based command line build tool for JavaScript projects
http://videojs.com
① Install grunt command line interface
② Clone the Video.js git repository
③ Install Dependencies
④ Build the library for production
Then there will be all files for production use in the dist directory
$ npm install -g grunt-cli
$ git clone https://github.com/USERNAME/video.js
$ npm install
$ grunt dist
• Skin Change by CSS
A custom skin can easily be created by editing the HTML and CSS
Video.js Skin Designer is also available
http://videojs.com
• How to create a plug-in with Video.js
http://videojs.com
① Write your Javascript code with the Video.js API
② Registering the plug-in
③ Using the plug-in
function examplePlugin(options) {
  this.on('play', function(e) {
    console.log('playback has started!');
  });
}
• this will be the Video.js player your plug-in is attached to
videojs.plugin('examplePlugin', examplePlugin);
• From this point on, your plug-in will be added to Video.js and will show up as a property on every instance created
videojs('vidId', { plugins: { examplePlugin: { exampleOption: true } } });
• Specify the plug-ins you'd like to initialize and any options you want to pass to them
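The registration and initialization flow above can be sketched as a small standalone program. This is a hypothetical, simplified stand-in for what videojs.plugin() does internally, written for illustration only; the registry, createPlayer, and handler names are assumptions, not the real video.js implementation.

```javascript
// Minimal sketch of a plug-in registry (assumed names, not video.js source).
var pluginRegistry = {};

function registerPlugin(name, fn) {
  // videojs.plugin(name, fn) essentially stores the function by name
  pluginRegistry[name] = fn;
}

// Stand-in for a player; inside a plug-in, `this` is the player instance.
function createPlayer(id, options) {
  var player = {
    id: id,
    handlers: {},
    on: function(evt, cb) {
      (this.handlers[evt] = this.handlers[evt] || []).push(cb);
    }
  };
  var plugins = (options && options.plugins) || {};
  Object.keys(plugins).forEach(function(name) {
    // each requested plug-in runs with the player as `this`
    pluginRegistry[name].call(player, plugins[name]);
  });
  return player;
}

// The example plug-in from the slide:
registerPlugin('examplePlugin', function(options) {
  this.on('play', function() { console.log('playback has started!'); });
});

var player = createPlayer('vidId', {
  plugins: { examplePlugin: { exampleOption: true } }
});
```

After initialization the plug-in's 'play' handler is attached to the player, which mirrors how a registered video.js plug-in becomes active on every instance that requests it.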
• Practice – Overlay Button Plug-in
http://tastybytes.net/creating-a-video-js-plugin/
① Creating the skeleton of the function that will serve as the constructor for your plug-in
(function() {
var pluginFn= function(options){
};
videojs.plugin( 'overlayButton', pluginFn );
})();
• This function will be called by video.js when it loops through all of the registered plug-ins
• Plug-in component code will be here
• Register the plug-in with the video.js
• Practice – Overlay Button Plug-in
http://tastybytes.net/creating-a-video-js-plugin/
② Creating the button component
(function() {
// Create the button
videojs.NewButton = videojs.Button.extend({
init: function( player, options ) {
// Initialize the button using the default constructor
videojs.Button.call( this, player, options );
}
});
// Set the text for the button
videojs.NewButton.prototype.buttonText = 'New Button';
// These are the defaults for this class.
videojs.NewButton.prototype.options_ = {};
// videojs.Button uses this function to build the class name.
videojs.NewButton.prototype.buildCSSClass = function() {
// Add our className to the returned className
return 'vjs-test-button ' + videojs.Button.prototype.buildCSSClass.call(this);
};
})();
• Plug-in component code
• The button is derived from the button class of Video.js
• Refer to http://docs.videojs.com/docs/api/index.html
• Set up some necessary properties of the button component
• Practice – Overlay Button Plug-in
http://tastybytes.net/creating-a-video-js-plugin/
③ Override the button click event handler
(function() {
// Create the button
.
.
.
// Set the text for the button
videojs.NewButton.prototype.buttonText = 'New Button';
// These are the defaults for this class.
videojs.NewButton.prototype.options_ = {};
// videojs.Button uses this function to build the class name.
videojs.NewButton.prototype.buildCSSClass = function() {
// Add our className to the returned className
return 'vjs-test-button ' + videojs.Button.prototype.buildCSSClass.call(this);
};
videojs.NewButton.prototype.onClick = function( e ) {
// to stop this event before it bubbles up to "window" for our event listener below.
e.stopImmediatePropagation();
alert('button is pressed');
};
})();
• videojs.Button already sets up the onclick event handler
• We just need to overwrite the callback
• This code just shows an alert message when the button is clicked
• You can perform any action if you replace the alert statement with your own code
• Practice – Overlay Button Plug-in
http://tastybytes.net/creating-a-video-js-plugin/
④ Attach the button to the video.js player
var pluginFn = function( options ) {
  // pass off the options to the new button.
  var newButtonComponent = new videojs.NewButton( this, options );
  // Set the default position for the button. Default: control-bar. Refer to HTML file
  var onScreen = options.onScreen || false;
  // Now we remove the onScreen option as it does not pertain to anything inside the button.
  delete options.onScreen;
  var NewButton;
  // Should the button be added to the control bar or screen?
  if ( onScreen ) {
    NewButton = this.addChild( newButtonComponent );
  } else {
    NewButton = this.controlBar.addChild( newButtonComponent );
  }
};
• Instantiate the button object and then add the object to the screen or the control bar object as its child
• Practice – Overlay Button Plug-in
http://tastybytes.net/creating-a-video-js-plugin/
⑤ Edit CSS (Cascading Style Sheets)
/* using the font and icon awesome set. */
@import url('http://maxcdn.bootstrapcdn.com/font-awesome/4.1.0/css/font-awesome.min.css');
.video-js .vjs-tech.vjs-blur {
-webkit-transition: .75s all; transition: .75s all;
-webkit-filter: blur(5px);
filter: blur(5px);}
.vjs-control.vjs-test-button {cursor: pointer;}
.vjs-control.vjs-test-button:before { font-family: FontAwesome;}
.video-js > .vjs-control.vjs-test-button {
position: absolute; top: 1em; right: 1em;}
.video-js > .vjs-control.vjs-test-button:hover:before {
text-shadow: 0 0 .3em rgba( 255,255,255,0.8);}
.vjs-control-bar .vjs-control.vjs-test-button:before { content: '\f03e';}
.video-js > .vjs-control.vjs-test-button:before {
content: '\f03e'; font-size: 2em; color: rgba(255,255,255,0.75);
text-shadow: 0 0 .5em rgba(0,0,0,0.8);}
• This applies the blur filter to the video/preview etc
• Icon for our initial button
• vjs-test-button is the class name defined in your Javascript file
• The styles for an on-screen button
• The styles for the button on a control bar
• Practice – Overlay Button Plug-in
http://tastybytes.net/creating-a-video-js-plugin/
⑤ Edit CSS (Continued)
.vjs-has-started.vjs-user-inactive.vjs-playing > .vjs-control.vjs-test-button,
.video-js > .vjs-control.vjs-test-button {
display: block; visibility: hidden; opacity: 0; -webkit-transition: visibility 1s,
opacity 1s; -moz-transition: visibility 1s, opacity 1s;-o-transition: visibility 1s,
opacity 1s; transition: visibility 1s, opacity 1s;
}
.vjs-has-started > .vjs-control.vjs-test-button {
display: block; visibility: visible; opacity: 1; -webkit-transition: visibility 0.1s,
opacity 0.1s; -moz-transition: visibility 0.1s, opacity 0.1s;
-o-transition: visibility 0.1s, opacity 0.1s; transition: visibility 0.1s, opacity 0.1s;
}
• To show/hide the on-screen button, mirroring the showing/hiding of the control bar
• Practice – Overlay Button Plug-in
http://tastybytes.net/creating-a-video-js-plugin/
⑥ Edit HTML
<!DOCTYPE html>
<head>
<title>Overlay Button</title>
<link href="node_modules/video.js/dist/video-js/video-js.css" rel="stylesheet">
<script src="node_modules/video.js/dist/video-js/video.dev.js"></script>
<link href="OverlayButton.css" rel="stylesheet" />
<script src="OverlayButton.js"></script>
</head>
<body>
<div class="video-container">
<video id="video_1" class="video-js vjs-default-skin" preload="auto"
controls width="640" height="360" poster="">
<source src="http://video-js.zencoder.com/oceans-clip.mp4" type='video/mp4'>
</video>
<script>
videojs('video_1', {
"techOrder": ["html5", "flash"],
"plugins":{
"overlayButton": {"onScreen": false}
}
}, function(){
console.log(" Player (this) is initialized and ready.");
});
</script>
</div>
</body>
• Include your css and Javascript
• To get the blur effect
• Use the overlayButton plug-in with control bar mode
• Practice – Overlay Button Plug-in
⑦ Screen Shot
• Useful Plug-ins 1 – Thumbnails on progress bar
It displays thumbnails when the cursor hovers over the progress bar
The thumbnails and the intervals at which they are shown can be configured in your HTML
The source code is available at https://github.com/brightcove/videojs-thumbnails
• Useful Plug-ins 1 - Thumbnails on progress bar – How to use
<head>
<title>Video.js Thumbnails Example</title>
<link href="node_modules/video.js/dist/video-js/video-js.css" rel="stylesheet">
<link href="videojs.thumbnails.css" rel="stylesheet">
<script src="node_modules/video.js/dist/video-js/video.js"></script>
<script src='videojs.thumbnails.js'></script>
</head>
① Include the videojs.thumbnails javascript and CSS files in the head of your page
② Pass the series of thumbnail properties to the plug-in as the first argument
<script>
var video = videojs('video');
video.thumbnails({
/**
display chapter_1.png if the user hovers over the first 10 seconds of
the progress bar
**/
0: { src: 'chapters/chapter_1.png', width: '120px'},
10: { src: 'chapters/chapter_2.png', width: '120px'},
20: { src: 'chapters/chapter_3.png', width: '120px'},
30: { src: 'chapters/chapter_4.png', width: '120px'},
40: { src: 'chapters/chapter_5.png', width: '120px'}
});
</script>
• Useful Plug-ins 1 - Thumbnails on progress bar – Source Analysis
videojs.plugin('thumbnails', function(options) {
var div, settings, img, player, progressControl, duration, moveListener, moveCancel;
settings = extend({}, defaults, options);
.
.
img = document.createElement('img');
div.appendChild(img);
img.src = settings['0'].src;
img.className = 'vjs-thumbnail';
extend(img.style, settings['0'].style);
.
.
// update the thumbnail while hovering
progressControl.on('mousemove', moveListener);
progressControl.on('touchmove', moveListener);
.
.
moveListener = function(event) {
.
.
mouseTime = Math.floor((left - progressControl.el().offsetLeft)
/ progressControl.width() * duration);
for (time in settings) {
if (mouseTime > time)
active = Math.max(active, time);
}
setting = settings[active];
if (setting.src && img.src != setting.src) {
img.src = setting.src;
}
.
.
}
}
• Create the thumbnail component
• Initially, the image source of the component is the first element in the series of properties
• Register the listener to update the thumbnail while hovering
• Set the current thumbnail as the image component's source to display it
• Calculate which thumbnail to display from the mouse position over the progress bar
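The lookup in moveListener can be exercised on its own: given the settings object (keyed by start time in seconds) and the time under the mouse, pick the thumbnail whose start time was passed most recently. This is a standalone re-implementation for illustration, with an assumed function name, not the plugin source.

```javascript
// Pick the thumbnail for the hovered time (illustrative stand-in).
function activeThumbnail(settings, mouseTime) {
  var active = 0;
  for (var time in settings) {
    // keep the largest configured start time that the mouse has passed
    if (mouseTime > Number(time)) {
      active = Math.max(active, Number(time));
    }
  }
  return settings[active];
}

var settings = {
  0:  { src: 'chapters/chapter_1.png' },
  10: { src: 'chapters/chapter_2.png' },
  20: { src: 'chapters/chapter_3.png' }
};
```

Hovering at 15 seconds, for example, selects the entry keyed 10, matching the behavior the slide describes.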
• Useful Plug-ins 2 – Play List UI
It displays the list of videos with their thumbnails and brief description
It plays a track when the user selects it
The source code is available at https://github.com/brightcove/videojs-playlist-ui
• Useful Plug-ins 2 – Play List UI – How to use
<link href="node_modules/video.js/dist/video-js.css" rel="stylesheet">
<link href="dist/videojs-playlist-ui.vertical.css" rel="stylesheet">
<script src="node_modules/video.js/dist/video.js"></script>
<script src="node_modules/videojs-playlist/dist/global.js"></script>
<script src="dist/videojs-playlist-ui.js"></script>
① Include the videojs-playlist-ui javascript and CSS files in the head of your page
② Display the play list
③ Initialize the plug-in and pass the playlist as the first argument
<script>
var player = videojs('video');
player.playlist([
//description for the first item
{
name: 'Disney\'s Oceans',
description: 'Explore the depths of our planet\'s oceans.',
duration: 45,
sources:[
{ src: 'http://vjs.zencdn.net/v/oceans.mp4', type: 'video/mp4' }],
thumbnail: false
},
//description for the second item
…
]);
player.playlistUi();
</script>
• global.js manages multiple videos or multiple audio tracks
• videojs-playlist-ui.js displays the list on your web page
<ol class="vjs-playlist"></ol>
• The playlist menu will be built automatically here
• Useful Plug-ins 2 – Play List UI – Source Analysis
var PlaylistMenuItem = (function (_Component) {
function PlaylistMenuItem(player, playlistItem, settings){
...
this.on(["click", "tap"], function (event) {
player.playlist.currentItem(indexOf(player.playlist(), _this.item));
if (settings.playOnSelect) {
player.play(); } });
...
}
...
}
var PlaylistMenu = (function (_Component2) {
function PlaylistMenu(player, settings) {
...
for (var i = 0; i < playlist.length; i++) {
var item = new PlaylistMenuItem(this.player_,{item: playlist[i]}, this.options_);
this.items.push(item);
this.addChild(item);
}
...
}
...
}
var playlistUi = function playlistUi(options) {
var player = this;
...
player.playlistMenu = new PlaylistMenu(player, settings);
...
// register components
videojs.registerComponent("PlaylistMenu", PlaylistMenu);
videojs.registerComponent("PlaylistMenuItem", PlaylistMenuItem);
// register the plugin
videojs.plugin("playlistUi", playlistUi);
}
• Play the item the user selects in the click event handler of each item
• Create all the PlaylistMenuItem instances
• Initialize the plug-in and register the components to the player
• Useful Plug-ins 3 – Resume Play
A video.js plug-in to resume playback of a video at the point where it was left off
The source code is available at https://github.com/sprice/videojs-resume
• Useful Plug-ins 3 – Resume Play – How to use
<link href="node_modules/video.js/dist/video-js.min.css" rel="stylesheet">
<link href="dist/videojs-resume.min.css" rel="stylesheet">
<script src="node_modules/video.js/dist/video.min.js" type="text/javascript"></script>
<script src="node_modules/store/store.min.js" type="text/javascript"></script>
<script src="dist/videojs-resume.min.js" type="text/javascript"></script>
① Include the videojs-resume javascript and CSS files in the head of your page
② Initialize the plug-in and pass the UUID of a video
<script type="text/javascript">
var player = videojs('example-video');
player.Resume({
uuid: 'UNIQUE_VIDEO_IDENTIFIER',
title: 'Resume?',
resumeButtonText: 'Yes',
cancelButtonText: 'No'
});
</script>
• store.js is available at https://github.com/marcuswestin/store.js
• Useful Plug-ins 3 – Resume Play – Source Analysis
class ResumeButton extends Button {
...
handleClick() {
this.player_.resumeModal.close();
this.player_.currentTime(this.resumeFromTime);
this.player_.play();
}
}
class ResumeCancelButton extends Button {
...
handleClick() {
this.player_.resumeModal.close();
store.remove(this.options_.key);
}
}
const Resume = function(options) {
let player = this;
let videoId = options.uuid;
let title = options.title || 'Resume from where you left off?';
let resumeButtonText = options.resumeButtonText || 'Resume';
let cancelButtonText = options.cancelButtonText || 'No Thanks';
let key = 'videojs-resume:' + videoId;
player.on('timeupdate', function() {
store.set(key, player.currentTime());
});
player.on('ended', function() {
store.remove(key);
});
...
player.ready(function() {
let resumeFromTime = store.get(key);
if (resumeFromTime) {
//show the resume dialogue
}
});
};
• Store the current playback time in local storage using store.js
• Remove the key when playback reaches the end
• Check whether a key was stored previously
• Resume playback at the stored time when the resume button is clicked
• Remove the key when the cancel button is clicked
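The bookkeeping above can be sketched without a player at all. Here a plain in-memory object stands in for store.js, and the function names (onTimeUpdate, onEnded, resumeFromTime) are assumptions chosen for illustration; only the key format 'videojs-resume:' + uuid comes from the source.

```javascript
// In-memory stand-in for store.js (hypothetical, for illustration).
var store = {
  data: {},
  set: function(k, v) { this.data[k] = v; },
  get: function(k) { return this.data[k]; },
  remove: function(k) { delete this.data[k]; }
};

function resumeKey(uuid) { return 'videojs-resume:' + uuid; }

// Called on every 'timeupdate': remember where the viewer is.
function onTimeUpdate(uuid, currentTime) {
  store.set(resumeKey(uuid), currentTime);
}

// Called on 'ended': a finished video should not offer to resume.
function onEnded(uuid) {
  store.remove(resumeKey(uuid));
}

// Called from player.ready(): is there a position to resume from?
function resumeFromTime(uuid) {
  return store.get(resumeKey(uuid));
}

onTimeUpdate('oceans-clip', 42.5); // viewer is 42.5 s into the video
```

In the real plug-in the store is backed by localStorage, so the saved position survives page reloads.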
• JavaScript image processing libraries help you manipulate images and apply various effects to them on your web site
• JSFeat
Open source (MIT License)
A javascript library that implements some advanced image processing in real time
• Clmtrackr
Open source (MIT License)
A javascript library for fitting facial models to faces in videos or images
• CamanJS
Open source (Free as in beer)
A javascript library that manipulates images using the HTML5 canvas
• OpenCV JS
The latest commit was made two years ago
OpenCV modules compiled to JS through Emscripten
2. JavaScript Image Processing Library
※ Emscripten is a source-to-source compiler that runs as a back end to the LLVM compiler and produces a subset of JavaScript known as asm.js (from Wikipedia)
JSFeat is an open source JavaScript image processing library
It implements some advanced image processing algorithms in real time
It supports
Grayscale, Box Blur, Gaussian Blur, Equalize Histogram, …
Canny Edges, HAAR/BBF Object Detector, Lucas-Kanade Optical Flow, Fast Corners Feature Detector, …
(1) JSFeat
http://inspirit.github.io/jsfeat/
Fast Corners Feature Detector Demo
• Data Structures of JSFeat
Most JSFeat methods rely on custom data structures
matrix_t
the core structure of JSFeat used as image representation
data_t
Just a wrapper for JavaScript ArrayBuffer
pyramid_t
A structure to wrap several matrix_t instances
keypoint_t
2D point with coordinates, level and score properties
var my_matrix = new jsfeat.matrix_t(columns, rows, data_type, data_buffer=undefined);
// single channel unsigned char
var data_type = jsfeat.U8_t | jsfeat.C1_t;
// 2 channels 32 bit integer
var data_type = jsfeat.S32_t | jsfeat.C2_t;
// 3 channels 32 bit float
var data_type = jsfeat.F32_t | jsfeat.C3_t;
var my_data = new jsfeat.data_t(size_in_bytes, buffer = undefined);
var levels = 3, start_width = 640, start_height = 480,
    data_type = jsfeat.U8_t | jsfeat.C1_t;
var my_pyramid = new jsfeat.pyramid_t(levels);
my_pyramid.allocate(start_width, start_height, data_type);
var level_0 = my_pyramid.data[0]; // cols = 640, rows = 480
var level_1 = my_pyramid.data[1]; // cols = 320, rows = 240
var level_2 = my_pyramid.data[2]; // cols = 160, rows = 120
var my_point = new jsfeat.keypoint_t(x = 0, y = 0, score = 0, level = 0);
• Math of JSFeat
It has several math utilities and supports various matrix operations
get_gaussian_kernel
Calculate gaussian kernel coefficients using specified options
qsort, median
Sorting and median value calculation of provided Array
transpose, multiply
multiply_ABt, multiply_AtB,
multiply_AAt, multiply_AtA
invert_3x3, multiply_3x3 (Quick manipulation)
mat3x3_determinant
var kernel_size = 5, sigma = 0, kernel_array = [], data_type = jsfeat.F32_t;
jsfeat.math.get_gaussian_kernel(kernel_size, sigma, kernel_array, data_type);
var arr = [10,2,1,0,0,4,6,1,3,8,5,3];
var cmp_numeric = function(a, b) { return (a < b); };
jsfeat.math.qsort(arr, 0, arr.length-1, cmp_numeric);
var median = jsfeat.math.median(arr, 0, arr.length-1);
jsfeat.matmath.transpose(At:matrix_t, A:matrix_t);
jsfeat.matmath.multiply(C:matrix_t, A:matrix_t, B:matrix_t);
jsfeat.matmath.multiply_xxx(C:matrix_t, A:matrix_t, B:matrix_t);
jsfeat.matmath.multiply_xxx(C:matrix_t, A:matrix_t);
jsfeat.matmath.invert_3x3(from:matrix_t, to:matrix_t);
jsfeat.matmath.multiply_3x3(C:matrix_t, A:matrix_t, B:matrix_t);
var determinant = jsfeat.matmath.mat3x3_determinant(M:matrix_t);
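To make the get_gaussian_kernel call above concrete, here is a standalone sketch of how 1D Gaussian kernel coefficients can be computed. This is an illustrative re-implementation, not the JSFeat source; the sigma-from-kernel-size default mirrors the common OpenCV convention and is an assumption.

```javascript
// Compute normalized 1D Gaussian kernel coefficients (illustrative sketch).
function gaussianKernel(kernelSize, sigma) {
  if (sigma <= 0) {
    // derive a default sigma from the kernel size (OpenCV-style convention)
    sigma = 0.3 * ((kernelSize - 1) * 0.5 - 1) + 0.8;
  }
  var center = (kernelSize - 1) * 0.5;
  var kernel = [], sum = 0;
  for (var i = 0; i < kernelSize; i++) {
    var x = i - center;
    var v = Math.exp(-(x * x) / (2 * sigma * sigma));
    kernel.push(v);
    sum += v;
  }
  // normalize so the coefficients sum to 1 (the blur preserves brightness)
  return kernel.map(function(v) { return v / sum; });
}

var k = gaussianKernel(5, 0);
```

The resulting kernel is symmetric, peaks at the center, and sums to 1, which is what gaussian_blur needs for a brightness-preserving smoothing pass.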
• Linear Algebra of JSFeat
lu_solve
Solve the linear equation Ax=B using Gaussian elimination
svd_decompose
Singular value decomposition
eigenVV
Compute eigenvalues and eigenvectors of a symmetric matrix
// A and B modified and result output in B
jsfeat.linalg.lu_solve(A:matrix_t, B:matrix_t);
// U - the left orthogonal matrix
// W - vector of singular values
// V - the right orthogonal matrix
// options - jsfeat.SVD_U_T and/or jsfeat.SVD_V_T to return transposed U and/or V
jsfeat.linalg.svd_decompose(A:matrix_t, W:matrix_t, U:matrix_t, V:matrix_t, options);
jsfeat.linalg.eigenVV(A:matrix_t, EigenVectors:matrix_t, EigenValues:matrix_t);
// you can ask for Vectors or Values only
jsfeat.linalg.eigenVV(A:matrix_t, null, EigenValues:matrix_t);
jsfeat.linalg.eigenVV(A:matrix_t, EigenVectors:matrix_t, null);
• JSFeat Demo Application
To run the demo applications, you need a webcam and a browser that supports WebRTC
All image processing operations of JSFeat work in real time on webcam input
① JavaScript files used in the demo
<script type="text/javascript" src="https://ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.min.js"></script>
<script type="text/javascript" src="http://inspirit.github.io/jsfeat/js/jsfeat-min.js"></script>
<script type="text/javascript" src="http://inspirit.github.io/jsfeat/js/compatibility.js"></script>
<script type="text/javascript" src="http://inspirit.github.io/jsfeat/js/profiler.js"></script>
<script type="text/javascript" src="http://inspirit.github.io/jsfeat/js/dat.gui.min.js"></script>
• jQuery Javascript Library
• JSFeat Javascript Library
• A shim for requestAnimationFrame and getUserMedia functions to support various web browsers
• To profile the performance of JSFeat functions
• A lightweight controller library from Data Arts Team of Google
• JSFeat Demo Application
② HTML
③ Initialization part
<video id="webcam" width="640" height="480" style="display:none;"></video>
<div style="width:640px;height:480px;margin: 10px auto;">
<canvas id="canvas" width="640" height="480"></canvas>
<div id="no_rtc" class="alert alert-error" style="display:none;"></div>
<div id="log" class="alert alert-info"></div>
</div>
• To get source images from your web cam
• To draw output images applying image processing operations
$(window).load(function() {
  var video = document.getElementById('webcam');
  var canvas = document.getElementById('canvas');
  ...
  var onDimensionsReady = function(width, height) {
    ctx = canvas.getContext('2d');
    img_u8 = new jsfeat.matrix_t(width, height, jsfeat.U8C1_t);
    compatibility.requestAnimationFrame(tick);
  };
  ...
  compatibility.getUserMedia({video: true}, function(stream) { ... });
  ...
  var stat = new profiler();
  ...
});
• This function is called after the entire window has loaded
• If video is available: 1. Get the context of the canvas 2. Prepare the image buffer 3. Request that the browser call the tick function to update the animation before the next repaint
• Get data from cameras without the use of plug-ins
• JSFeat Demo Application
④ Image Processing part
function tick() {
  compatibility.requestAnimationFrame(tick);
  ...
  if (video.readyState === video.HAVE_ENOUGH_DATA) {
    ctx.drawImage(video, 0, 0, width, height);
    var imageData = ctx.getImageData(0, 0, width, height);
    var data_u32 = new Uint32Array(imageData.data.buffer);
    var alpha = (0xff << 24);
    var i = img_u8.cols * img_u8.rows, pix = 0;
    while (--i >= 0) {
      pix = img_u8.data[i];
      data_u32[i] = alpha | (pix << 16) | (pix << 8) | pix;
    }
    ctx.putImageData(imageData, 0, 0);
  }
}
• This function is called when the browser is ready to paint the next frame
• Get an image data object representing the pixel data for the area of the canvas
Put the image processing code here
• Render image processing results back to the canvas
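The writeback loop in tick packs one 8-bit gray value into each 32-bit RGBA pixel. Canvas ImageData is stored as RGBA bytes, so on a little-endian machine a pixel viewed through a Uint32Array reads as 0xAABBGGRR. A standalone sketch of the packing (the function name is assumed, for illustration only):

```javascript
// Pack grayscale values into 32-bit RGBA words as the tick loop does.
function grayToRGBA(gray) {
  var out = new Uint32Array(gray.length);
  var alpha = (0xff << 24); // fully opaque alpha channel
  for (var i = 0; i < gray.length; i++) {
    var pix = gray[i];
    // setting B, G, and R to the same value yields a gray pixel
    out[i] = alpha | (pix << 16) | (pix << 8) | pix;
  }
  return out;
}

var pixels = grayToRGBA([0, 128, 255]); // black, mid-gray, white
```

Writing through the Uint32Array view of imageData.data.buffer, as the demo does, sets all four channel bytes of a pixel in a single store.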
• Example 1 – Canny Edge Detector
A well-known edge detection algorithm developed by John F. Canny in 1986
jsfeat.imgproc.canny(source:matrix_t, dest:matrix_t, low_threshold, high_threshold);
① Interface
② Demo Page
http://inspirit.github.io/jsfeat/sample_canny_edge.html
• Example 1 – Canny Edge Detector
③ Code Explanation
function tick() {
. . .
ctx.drawImage(video, 0, 0, width, height);
var imageData = ctx.getImageData(0, 0, width, height);
options.blur_radius = 2;
options.low_threshold = 20;
options.high_threshold = 50;
jsfeat.imgproc.grayscale(imageData.data, width, height, img_u8);
var r = options.blur_radius|0;
var kernel_size = (r+1) << 1;
jsfeat.imgproc.gaussian_blur(img_u8, img_u8, kernel_size, 0);
jsfeat.imgproc.canny(img_u8, img_u8, options.low_threshold|0, options.high_threshold|0);
// render result back to canvas
. . .
}
• Optimal initial values for each variable
• Convert the color input to grayscale using the formula Y = 0.299*R + 0.587*G + 0.114*B
• Reduce noise with a given kernel size
• Run canny edge detection and get back the results to the img_u8 buffer
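The luma conversion described above can be reproduced in a few lines. This is a standalone sketch for illustration, not the JSFeat source; the input follows the canvas ImageData layout (RGBA, 4 bytes per pixel) and the function name is assumed.

```javascript
// Convert RGBA pixel data to grayscale with the BT.601 luma weights.
function grayscale(rgba, width, height) {
  var out = new Uint8Array(width * height);
  for (var i = 0, j = 0; j < out.length; i += 4, j++) {
    // Y = 0.299*R + 0.587*G + 0.114*B; |0 truncates to an integer
    out[j] = (0.299 * rgba[i] + 0.587 * rgba[i + 1] + 0.114 * rgba[i + 2]) | 0;
  }
  return out;
}

// One pure-red, one pure-green, one pure-blue pixel:
var gray = grayscale([255,0,0,255, 0,255,0,255, 0,0,255,255], 3, 1);
```

Green contributes the most to perceived brightness and blue the least, which is why the three weights differ.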
• Example 2 – FAST Corners Detector
The FAST (Features from Accelerated Segment Test) corner detection algorithm was proposed by Edward Rosten and Tom Drummond
jsfeat.fast_corners.set_threshold(threshold); var count = jsfeat.fast_corners.detect(img:matrix_t, corners:Array, border = 3);
① Interface
② Demo Page
http://inspirit.github.io/jsfeat/sample_fast_corners.html
• Example 2 – FAST Corners Detector
③ Code Explanation
//in initialization part
{
img_u8 = new jsfeat.matrix_t(640, 480, jsfeat.U8_t | jsfeat.C1_t);
corners = [];
var i = 640*480;
while(--i >= 0) {
corners[i] = new jsfeat.keypoint_t(0,0,0,0);
}
threshold = 20;
jsfeat.fast_corners.set_threshold(threshold);
}
function tick() {
. . .
ctx.drawImage(video, 0, 0, width, height);
var imageData = ctx.getImageData(0, 0, width, height);
jsfeat.imgproc.grayscale(imageData.data, width, height, img_u8);
var count = jsfeat.fast_corners.detect(img_u8, corners, 5);
var data_u32 = new Uint32Array(imageData.data.buffer);
var pix = (0xff << 24) | (0x00 << 16) | (0xff << 8) | 0x00;
for(var i=0; i < count; ++i){
var x = corners[i].x; var y = corners[i].y;
var off = (x + y * width);
data_u32[off] = pix;
data_u32[off-1] = pix; data_u32[off+1] = pix;
data_u32[off-width] = pix; data_u32[off+width] = pix;
}
// render result back to canvas
. . .
}
• Buffer to store the result of FAST corner detection
• Render corners on the original frame image
• Example 3 – HAAR Cascades Object Detector
An object detection algorithm proposed by Paul Viola and improved by Rainer Lienhart
var rects:Array = jsfeat.haar.detect_multi_scale(int_sum:Array, int_sqsum:Array, int_tilted:Array, int_canny_sum:Array, width, height, classifier, scale_factor = 1.2, scale_min = 1);
① Interface
② Demo Page
http://inspirit.github.io/jsfeat/sample_haar_face.html
• int_sum - integral of the source image
• int_sqsum - squared integral of the source image
• int_tilted - tilted integral of the source image
• int_canny_sum - integral of the Canny edge image, or undefined
• width - width of the source image
• height - height of the source image
• classifier - haar cascade classifier
• scale_factor - how much the image size is reduced at each image scale
• scale_min - start scale
• rects - rectangles representing detected objects
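The scale_factor and scale_min parameters drive a geometric sweep: detection starts at scale_min and the scale is multiplied by scale_factor until the scaled classifier window no longer fits the image. The sketch below is an illustrative stand-in for that sweep, not the JSFeat implementation; the function name and the 24px base window are assumptions.

```javascript
// Enumerate the scales a multi-scale detector would visit (illustrative).
function scaleSequence(width, height, windowSize, scaleFactor, scaleMin) {
  var scales = [];
  for (var scale = scaleMin;
       windowSize * scale <= width && windowSize * scale <= height;
       scale *= scaleFactor) {
    scales.push(scale);
  }
  return scales;
}

// e.g. a 160x120 work image with an assumed 24px base window:
var scales = scaleSequence(160, 120, 24, 1.2, 1);
```

A larger scale_factor means fewer scales and faster detection but a higher chance of missing faces whose size falls between two steps.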
• Example 3 – HAAR Cascades Object Detector
③ Code Explanation
//in initialization part
{
. . .
options.min_scale = 2;
options.scale_factor = 1.15;
options.edges_density = 0.13;
var max_work_size = 160;
var scale = Math.min(max_work_size/videoWidth, max_work_size/videoHeight);
var w = (videoWidth*scale)|0;
var h = (videoHeight*scale)|0;
img_u8 = new jsfeat.matrix_t(w, h, jsfeat.U8_t | jsfeat.C1_t);
edg = new jsfeat.matrix_t(w, h, jsfeat.U8_t | jsfeat.C1_t);
work_canvas = document.createElement('canvas');
work_canvas.width = w;
work_canvas.height = h;
work_ctx = work_canvas.getContext('2d');
ii_sum = new Int32Array((w+1)*(h+1));
ii_sqsum = new Int32Array((w+1)*(h+1));
ii_tilted = new Int32Array((w+1)*(h+1));
ii_canny = new Int32Array((w+1)*(h+1));
. . .
}
• Set the work size to 160 to speed up processing
• Create work_canvas to get scaled-down image data
• Example 3 – HAAR Cascades Object Detector
③ Code Explanation (Continued)
function tick() {
. . .
ctx.drawImage(video, 0, 0, width, height);
var imageData = ctx.getImageData(0, 0, width, height);
jsfeat.imgproc.grayscale(imageData.data, width, height, img_u8);
jsfeat.imgproc.equalize_histogram(img_u8, img_u8);
jsfeat.imgproc.compute_integral_image(img_u8, ii_sum, ii_sqsum, ii_tilted);
jsfeat.imgproc.canny(img_u8, edg, 10, 50);
jsfeat.imgproc.compute_integral_image(edg, ii_canny, null, null);
jsfeat.haar.edges_density = options.edges_density;
var rects = jsfeat.haar.detect_multi_scale(ii_sum, ii_sqsum, ii_tilted,
ii_canny, img_u8.cols, img_u8.rows,
jsfeat.haar.frontalface, options.scale_factor, options.min_scale);
rects = jsfeat.haar.group_rectangles(rects, 1);
draw_faces(ctx, rects, canvasWidth/img_u8.cols, 1);
}
function draw_faces(ctx, rects, sc, max) {
var on = rects.length;
if(on && max)
jsfeat.math.qsort(rects, 0, on-1, function(a,b){return (b.confidence<a.confidence);})
var n = max || on;
n = Math.min(n, on);
var r;
for(var i = 0; i < n; ++i) {
r = rects[i];
ctx.strokeRect((r.x*sc)|0,(r.y*sc)|0,(r.width*sc)|0,(r.height*sc)|0);
}
}
• Generate an integral image (sum, mean, standard deviation)
• Refer to https://en.wikipedia.org/wiki/Summed_area_table
• Group the candidate object rectangles
• Draw rectangles on the original frame image
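The integral (summed-area) image is what lets the Haar detector evaluate any rectangular region in four lookups, regardless of its size. A minimal plain-JavaScript sketch of the idea (not jsfeat's actual implementation, which also builds squared and tilted sums):

```javascript
// Build a (w+1) x (h+1) summed-area table from grayscale pixel data.
// Each entry holds the sum of all pixels above and to the left of it.
function integralImage(data, w, h) {
  const ii = new Int32Array((w + 1) * (h + 1));
  for (let y = 0; y < h; y++) {
    let rowSum = 0;
    for (let x = 0; x < w; x++) {
      rowSum += data[y * w + x];
      ii[(y + 1) * (w + 1) + (x + 1)] = ii[y * (w + 1) + (x + 1)] + rowSum;
    }
  }
  return ii;
}

// Sum of the half-open rectangle [x0,x1) x [y0,y1) in four lookups.
function rectSum(ii, w, x0, y0, x1, y1) {
  const W = w + 1;
  return ii[y1 * W + x1] - ii[y0 * W + x1] - ii[y1 * W + x0] + ii[y0 * W + x0];
}
```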
- 41 - Advanced Multimedia Processing for WebApp on Tizen
An open source JavaScript library for fitting facial models to faces in videos or images
It implements constrained local models fitted by regularized landmark mean-shift
It tracks a face and outputs the coordinate positions of the face model as an array, following the numbering of the model below
(2) clmtrackr
https://github.com/auduno/clmtrackr
- 42 - Advanced Multimedia Processing for WebApp on Tizen
• Application structure using clmtrackr.js
<script src="clmtrackr.js"></script>
<script src="model/model_pca_20_svm.js"></script>
① Include clmtrackr.js script in your HTML file
② Initialization
③ Start tracking
④ Getting the points of the currently fitted model
⑤ Drawing the currently fitted model on a given canvas
var ctracker = new clm.tracker();
ctracker.init(pModel);
• Choose one of the following models
• pModel is defined in model_xx.js
ctracker.start(element)
• element is a canvas or video element
var positions = ctracker.getCurrentPosition();
• getCurrentPosition returns the position of the fitted facial model as an array
var drawCanvas = document.getElementById('somecanvas');
ctracker.draw(drawCanvas);
• model_pca_10_svm.js : SVM kernel for classifiers, 10 components PCA
• model_pca_20_svm.js : SVM kernel for classifiers, 20 components PCA (default)
• model_spca_10_svm.js : SVM kernel for classifiers, 10 components Sparse PCA
• model_spca_20_svm.js : SVM kernel for classifiers, 20 components Sparse PCA
• model_pca_10_mosse.js : MOSSE filter for classifiers, 10 components PCA
• model_pca_20_mosse.js : MOSSE filter for classifiers, 20 components PCA
A model with fewer components will be slightly faster, with some loss of precision. The MOSSE filter classifiers run faster than the SVM kernels on computers without WebGL support, but fit slightly more poorly.
- 43 - Advanced Multimedia Processing for WebApp on Tizen
• Example 1 – clm_video
This application tracks your face based on the video input from your webcam
clm.init(model)
clm.start(element, box)
clm.getCurrentPosition()
clm.draw(context)
① Interface
② Demo Page
http://auduno.github.io/clmtrackr/clm_video.html
- 44 - Advanced Multimedia Processing for WebApp on Tizen
• Example 1 – clm_video
③ Code Explanation
<script>
var vid = document.getElementById('videoel');
var overlay = document.getElementById('overlay');
var overlayCC = overlay.getContext('2d');
var ctrack = new clm.tracker({useWebGL : true});
ctrack.init(pModel); //uses model_pca_20_svm.js as a facial model
. . .
navigator.getUserMedia({video : true}, function(stream) {
vid.src = window.URL.createObjectURL(stream);
});
. . .
function startVideo() {
vid.play();
ctrack.start(vid);
drawLoop();
}
function drawLoop() {
requestAnimFrame(drawLoop);
overlayCC.clearRect(0, 0, 400, 300);
if (ctrack.getCurrentPosition()) {
ctrack.draw(overlay);
}
}
</script>
• Initialization
<div id="container">
<video id="videoel" width="400" height="300" preload="auto" loop></video>
<canvas id="overlay" width="400" height="300"></canvas>
</div>
• videoel for the webcam stream
• overlay for drawing the fitted model
• Setup webcam
• Start tracking
• Get the points of the currently fitted model and draw them on the overlay canvas
- 45 - Advanced Multimedia Processing for WebApp on Tizen
• Example 2 – clm_emotiondetection
This application tracks your face in the video input from your webcam and estimates your emotional expression
emotionClassifier.init(emotionmodel)
emotionClassifier.meanPredict(currentparameters)
① Interface
② Demo Page
The screen shot is from http://www.theatlantic.com/technology/archive/2014/01/this-app-reads-your-emotions-on-your-face/282993/
• model_pca_20_svm_emotionDetection
• Returns probabilities for angry, sad, surprised and happy respectively
- 46 - Advanced Multimedia Processing for WebApp on Tizen
• Example 2 – clm_emotiondetection
③ Code Explanation
<script>
var vid = document.getElementById('videoel');
var overlay = document.getElementById('overlay');
var overlayCC = overlay.getContext('2d');
var ctrack = new clm.tracker({useWebGL : true});
ctrack.init(pModel); //uses model_pca_20_svm_emotionDetection.js as a model
var ec = new emotionClassifier();
ec.init(emotionModel);
var emotionData = ec.getBlank();
. . .
navigator.getUserMedia({video : true}, function(stream) {
vid.src = window.URL.createObjectURL(stream);
});
. . .
function startVideo() {
vid.play();
ctrack.start(vid);
drawLoop();
}
//next page
• Initialization
<div id="container">
<video id="videoel" width="400" height="300" preload="auto" loop></video>
<canvas id="overlay" width="400" height="300"></canvas>
</div> • videoel for webcam • Overlay for drawing fitting model
• Set up the emotion classifier
• getBlank returns four emotions with zero probability
• Setup webcam
<script src="../models/model_pca_20_svm_emotionDetection.js"></script>
• Start tracking
- 47 - Advanced Multimedia Processing for WebApp on Tizen
• Example 2 – clm_emotiondetection
③ Code Explanation (Continued)
. . .
function drawLoop() {
requestAnimFrame(drawLoop);
overlayCC.clearRect(0, 0, 400, 300);
if (ctrack.getCurrentPosition()) {
ctrack.draw(overlay);
}
var cp = ctrack.getCurrentParameters();
var er = ec.meanPredict(cp);
if (er) {
updateData(er);
for (var i = 0;i < er.length;i++) {
if (er[i].value > 0.4) {
document.getElementById('icon'+(i+1)).style.visibility = 'visible';
} else {
document.getElementById('icon'+(i+1)).style.visibility = 'hidden';
}
}
}
</script>
• Predict user emotion based on current fitted model
• Display the emotion data
• Display an emotion icon when its probability is more than 0.4
• Get the points of the currently fitted model and draw them on the overlay canvas
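The icon-toggling step in drawLoop() is a simple threshold over the classifier's output. A standalone sketch of that logic (the 0.4 cutoff and the {emotion, value} result shape follow the demo code; the sample data is illustrative):

```javascript
// Given meanPredict-style results [{emotion, value}, ...], return the
// names of the emotions whose probability exceeds the cutoff.
function visibleEmotions(er, cutoff) {
  return er
    .filter(function (e) { return e.value > cutoff; })
    .map(function (e) { return e.emotion; });
}

// Hypothetical classifier output for one frame.
const sample = [
  { emotion: 'angry', value: 0.10 },
  { emotion: 'sad', value: 0.05 },
  { emotion: 'surprised', value: 0.30 },
  { emotion: 'happy', value: 0.70 }
];
const shown = visibleEmotions(sample, 0.4);
```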
- 48 - Advanced Multimedia Processing for WebApp on Tizen
• JavaScript audio processing libraries help you manipulate audio and apply various effects to it on your website
• P5.Sound
Open source (LGPL)
a JavaScript library that provides Web Audio functionality
• Buzz
Open source (MIT License)
A small but powerful JavaScript library that lets you easily take advantage of the new HTML5 audio element
• SoundJS
Open source (MIT License)
A JavaScript library for working with audio
It provides a consistent API for loading and playing audio on different browsers and devices
• WAD
Open source (MIT License)
A JavaScript library for manipulating audio using the new HTML5 Web Audio API
3. JavaScript Audio Processing Library
- 49 - Advanced Multimedia Processing for WebApp on Tizen
P5.sound is a JavaScript library that provides Web Audio functionality including audio input, playback, recording, manipulation, effects, sequencing, analysis and synthesis
Load and play sound files, manipulate playback
Get the current volume of a sound
Get sound from an input source like a computer mic
Analyze the frequency of sound
Waveforms for playback & modulation
White, pink or brown noise generator
Trigger an attack/release envelope, or modulate other parameters
Simulate the sound of real physical spaces w/ convolution
Filter the frequency range of a sound
(1) P5.sound in P5.js
https://github.com/processing/p5.js-sound
- 50 - Advanced Multimedia Processing for WebApp on Tizen
• Application structure using P5.js
<script src="p5.min.js"></script>
<script src="p5.sound.min.js"></script>
① Include the p5.js scripts in your HTML file
② The browser then calls the setup() function when the program starts
③ The draw() function is called directly after setup()
function setup() {
  // define initial environment properties
}
• There can only be one setup() function per program, and it should not be called again after its initial execution
function draw() {
  // draw something on canvas
}
• This function continuously executes until the program is stopped or noLoop() is called
• loop() causes the code inside draw() to resume executing continuously
• frameRate() controls how many times draw() executes per second
<head>
<script type="text/javascript" src="p5.min.js"></script>
<script type="text/javascript" src="p5.sound.min.js"></script>
<script type="text/javascript" src="sketch.js"></script>
</head>
Demo.html
function setup() {
  // initialize something
}
function draw() {
  // draw something
}
sketch.js
- 51 - Advanced Multimedia Processing for WebApp on Tizen
• P5.Sound Library
Class Description
p5.SoundFile Load and play sound files
p5.Amplitude Get the current volume of a sound
p5.AudioIn Get sound from an input source, typically a computer microphone
p5.FFT Analyze the frequency of sound. Returns results from the frequency spectrum or time domain (waveform)
p5.Oscillator Generate Sine, Triangle, Square and Sawtooth waveforms
p5.Env An Envelope is a series of fades over time
p5.Delay A delay effect with parameters for feedback, delayTime, and lowpass filter
p5.Filter Filter the frequency range of a sound
p5.Reverb Add reverb to a sound by specifying duration and decay
p5.Convolver Simulate the sound of real physical spaces through convolution
p5.SoundRecorder Record sound for playback / save as a .wav file
p5.Phrase, p5.Part, p5.Score Compose musical sequences
- 52 - Advanced Multimedia Processing for WebApp on Tizen
• Example 1 – mic FFT
new p5.FFT([smoothing],[bins])
① Interface of p5.FFT
Refer to http://p5js.org/reference/#/p5.FFT
• [smoothing]: Smooth results of Freq Spectrum. 0.0 < smoothing < 1.0. Defaults to 0.8
• [bins]: Length of resulting array. Must be a power of two between 16 and 1024. Defaults to 1024
• It returns FFT object
1. FFT.setInput([source])
• Set the input source for the FFT analysis
• [source]: p5.sound object
2. FFT.waveform()
• Returns an array of amplitude values along the time domain
3. FFT.analyze()
• Returns an array of amplitude values along the frequency domain
4. FFT.getEnergy(frequency1, [frequency2])
• Returns the amount of energy at a specific frequency, or the average amount of energy between two frequencies
• frequency1: Number or String ('lowMid', 'mid', 'highMid', 'treble')
• [frequency2]: Number, optional
5. FFT.smooth(smoothing)
• Smooth FFT analysis by averaging with the last analysis frame
• smoothing: Number, 0.0 < smoothing < 1.0. Defaults to 0.8
※ [] means optional parameter
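Each bin of the analysis covers an equal slice of the spectrum up to the Nyquist frequency (half the sample rate). That is a general FFT property rather than anything p5-specific, but it is useful for reading the analyze() array. A sketch of mapping a frequency in Hz to a bin index, under that assumption:

```javascript
// Map a frequency in Hz to the index of the FFT bin that contains it.
// bins = length of the analysis array; the spectrum spans 0..sampleRate/2.
function freqToBin(freq, bins, sampleRate) {
  const nyquist = sampleRate / 2;
  return Math.round(freq / (nyquist / bins));
}

// With 1024 bins at a 44100 Hz sample rate, each bin is ~21.5 Hz wide,
// so concert A (440 Hz) lands near the bottom of the spectrum.
const binA440 = freqToBin(440, 1024, 44100);
```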
- 53 - Advanced Multimedia Processing for WebApp on Tizen
• Example 1 – mic FFT
② Code Explanation
var mic, fft;
function setup() {
createCanvas(512,400);
noStroke();
fill(0,255,255);
mic = new p5.AudioIn();
mic.start();
fft = new p5.FFT();
fft.setInput(mic);
}
function draw() {
background(200);
var spectrum = fft.analyze();
beginShape();
vertex(0, height);
for (i = 0; i<spectrum.length; i++) {
vertex(i, map(spectrum[i], 0, 255, height, 0) );
}
endShape();
}
• Get audio from an input, i.e. your computer’s microphone
• Get an array of amplitude values (0-255) across the frequency spectrum; by default the array has 1024 bins
• Draw spectrum
• Create an FFT object and set the mic object as its input
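The map() call in draw() is p5's linear rescaling helper, which flips loud spectrum values to small y coordinates so louder bins draw taller bars. Its arithmetic can be reproduced and checked outside the browser (this mirrors map()'s documented behavior without clamping, not p5's source):

```javascript
// Linearly rescale n from the range [start1, stop1] to [start2, stop2],
// the same mapping p5's map() performs.
function mapRange(n, start1, stop1, start2, stop2) {
  return start2 + (stop2 - start2) * ((n - start1) / (stop1 - start1));
}

// A spectrum value of 255 (loudest) maps to y = 0 (top of a 400px canvas)
// and 0 maps to y = 400 (bottom), inverting the axis for drawing.
const yLoud = mapRange(255, 0, 255, 400, 0);
const ySilent = mapRange(0, 0, 255, 400, 0);
```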
- 54 - Advanced Multimedia Processing for WebApp on Tizen
• Example 2 – Reverb_basic
new p5.Reverb()
① Interface of p5.Reverb
Refer to http://p5js.org/reference/#/p5.Reverb
• Reverb adds depth to a sound through a large number of decaying echoes
1. Reverb.process(src, [seconds], [decayRate], [reverse])
• Connect a source to the reverb and assign reverb parameters
• src: p5.sound object
2. Reverb.amp(volume, [rampTime], [timeFromNow])
• Set the output level of the reverb effect
• volume: amplitude between 0 and 1.0
3. Reverb.connect(unit)
• Send output to a p5.sound or Web Audio object
4. Reverb.disconnect()
• Disconnect all output
※ [] means optional parameter
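Reverb's "large number of decaying echoes" can be illustrated with the simplest possible recirculating delay line. This is a conceptual sketch of decaying echoes only, not how p5.Reverb is implemented (it is built on convolution):

```javascript
// Feed an input signal through a feedback delay line: each output sample
// re-adds the signal from 'delay' samples earlier, scaled by 'feedback',
// producing a train of exponentially decaying echoes.
function echoes(input, delay, feedback, length) {
  const out = new Float64Array(length);
  for (let n = 0; n < length; n++) {
    out[n] = (input[n] || 0) + (n >= delay ? feedback * out[n - delay] : 0);
  }
  return out;
}

// A single impulse turns into echoes at multiples of the delay time,
// each one half as loud as the previous with feedback = 0.5.
const tail = echoes([1], 4, 0.5, 16);
```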
- 55 - Advanced Multimedia Processing for WebApp on Tizen
• Example 2 – Reverb_basic
② Code Explanation
var soundFile, reverb;
function preload() {
soundFormats('mp3', 'ogg');
soundFile = loadSound('../files/Damscray_-_Dancing_Tiger_02');
// disconnect the default connection
// so that we only hear the sound via the reverb.process
soundFile.disconnect();
}
function setup() {
createCanvas(720,100);
background(0);
reverb = new p5.Reverb();
// connects soundFile to reverb with a
// reverbTime of 6 seconds, decayRate of 0.2%
reverb.process(soundFile, 6, 0.2);
reverb.amp(3); // turn it up!
}
function mousePressed() {
soundFile.play();
}
• Load an mp3 or ogg file
- 56 - Advanced Multimedia Processing for WebApp on Tizen
It simplifies the process of creating, playing, and manipulating audio, either for real-time playback or at scheduled intervals
Load and play sound files, manipulate playback
sound creation
panning
filters
pitching
detunes
delays
reverberation
(2) WAD
https://github.com/rserota/wad
- 57 - Advanced Multimedia Processing for WebApp on Tizen
• Application structure using WAD.js
<script src="wad.min.js"></script>
① Include the wad.min.js script in your HTML file
② Create WAD object and just play it
③ Constructor Arguments
var soundobject = new Wad({source : 'bell.wav'})
or
var soundobject = new Wad({source : 'sawtooth'})
soundobject.play();
soundobject.stop();
• Single audio file
var saw = new Wad({
    source : 'sawtooth',
    volume : 1.0,  // Peak volume can range from 0 to an arbitrarily high number
    loop : false,  // If true, the audio will loop
    pitch : 'A4',
    detune : 0,
    panning : -.5, // Possible values are from 1 to -1
    env : { . . . },
    filter : { . . . },
    reverb : { . . . },
    . . .
})
• Create oscillators by specifying one of 'sine', 'square', 'sawtooth', or 'triangle' as the source
• Envelope settings
• attack, decay, sustain, hold, release attributes are available
• Filter settings
• type, frequency, q-factor, env attributes are available
• Reverb settings
• wet, impulse attributes are available
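The attack/decay/sustain/hold/release attributes describe a piecewise amplitude envelope applied over the life of a note. A sketch of evaluating such an envelope at a time t, assuming linear segments (the exact curve shapes are an assumption for illustration; Wad's internals may differ):

```javascript
// Evaluate a linear ADSHR envelope at time t (seconds).
// attack: ramp 0 -> 1; decay: ramp 1 -> sustain; hold: stay at sustain;
// release: ramp sustain -> 0; afterwards silence.
function envelopeAt(t, e) {
  if (t < e.attack) return t / e.attack;
  t -= e.attack;
  if (t < e.decay) return 1 - (1 - e.sustain) * (t / e.decay);
  t -= e.decay;
  if (t < e.hold) return e.sustain;
  t -= e.hold;
  if (t < e.release) return e.sustain * (1 - t / e.release);
  return 0;
}

// Hypothetical settings in the same shape as a Wad env object.
const env = { attack: 0.1, decay: 0.1, sustain: 0.5, hold: 0.2, release: 0.1 };
```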
- 58 - Advanced Multimedia Processing for WebApp on Tizen
• Example 1 – growingDisharmony
- 59 - Advanced Multimedia Processing for WebApp on Tizen
• Example 1 – growingDisharmony
① Code Explanation
<script>
var notes = [];
setTimeout(function(){ start();},100);
function start(){
for(var i = 0; i < 20; ++i){
var f = 10;
for(var ff = 0; ff < i; ++ff)
f += f;
addNote(f);
}
play();
}
function play(){
for(var i = 0; i < notes.length; ++i){
notes[i].play({ volume : (1.0 - i*0.05) });
notes[i].pitch += Math.random() * 100.0 - 50.0 ;
}
setTimeout(play,600);
}
function addNote(frequ){
var note = new Wad({
source : 'sine',
pitch : frequ,
env : {
attack : .02,decay : .1,sustain : .9,hold : .4,release : .1
}
});
notes.push(note);
}
</script>
• Create notes to play
• Play the notes every 0.6 seconds, changing the pitch each time
• Sound arguments
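The inner loop in start() doubles f on every pass, so the i-th note starts an octave above the previous one: f = 10 · 2^i. A standalone check of that series (illustrative, independent of WAD):

```javascript
// Reproduce the frequency series built by start(): begin at 10 Hz
// and double i times, giving 10 * 2^i for the i-th note.
function noteFrequency(i) {
  let f = 10;
  for (let ff = 0; ff < i; ff++) f += f; // f += f doubles f each pass
  return f;
}

// The first few notes: 10, 20, 40, 80, ... each an octave apart.
const series = [0, 1, 2, 3].map(noteFrequency);
```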
- 60 - Advanced Multimedia Processing for WebApp on Tizen
Emscripten is an LLVM-based project that compiles C and C++ into highly-optimizable JavaScript in asm.js format. This lets you run C and C++ on the web at near-native speed, without plugins.
http://kripken.github.io/emscripten-site/index.html
Videoconverter.js
Its goal is to convert any video file into another video format in your browser
ffmpeg has been compiled into JavaScript through Emscripten in this project
refer to https://bgrins.github.io/videoconverter.js/
Opencv.js
An engineer tried to compile OpenCV into JavaScript through Emscripten, but the project may no longer be active
refer to https://github.com/blittle/opencvjs
• Introduction to Emscripten Project