As the lead of the VLC team: we've been discussing this for quite a while, and we believe it is totally doable.
But we believe we need WebAssembly + Wasm threads. Without threads, porting VLC would be a very difficult task: you notably need one thread for input, one for audio output, one for video output, and one for the playlist.
However, threads are supposed to come in an update to WebAssembly. So that will be cool.
Also, VLC is very heavily modularized, and Wasm will bring modules.
Finally, porting all the codecs will be hard (FFmpeg), but VLC has some very simple modules for a few codecs, like Theora and MPEG-2, so it is easy to start and port one module after another...
By way of update from the standards trenches: threads are indeed a part of the WebAssembly plan, and we're also actively working on standardizing the SharedArrayBuffer API for JavaScript, which would allow an asm.js implementation with threads (either as a direct compilation target or as a WebAssembly polyfill). Shu-yu Guo has been working on drafting the SharedArrayBuffer spec and is planning to present an update in the next standards meeting in a couple weeks.
The two strengths provided by the model are sequentially consistent atomics and something between the strengths of C++'s non-atomics and relaxed atomics. Races are fully defined, and there is no undefined behavior or undefined values.
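For the curious, the sequentially consistent atomics in that model map onto the proposed `Atomics` API. A minimal sketch in today's JavaScript (runnable in Node; in a real program the `SharedArrayBuffer` would be handed to a worker):

```javascript
// A SharedArrayBuffer can be shared between the main thread and workers;
// Atomics operations on an Int32Array view are sequentially consistent.
const sab = new SharedArrayBuffer(4); // one 32-bit slot
const counter = new Int32Array(sab);

// Atomically increment; returns the value *before* the addition.
const before = Atomics.add(counter, 0, 1);

// Atomically read the current value.
const after = Atomics.load(counter, 0);

console.log(before, after); // 0 1
```

Plain reads and writes to the same view are the "non-atomic but fully defined" accesses the model describes: racy, but never undefined behavior.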
I'm happy to discuss things more in a new thread or in private communication and would prefer to not derail this thread about VLC.
Yea. The idea is to get the broad codec/container/etc support that VLC has, not to supplant native rendering of modern codecs. We haven't actually written any code yet, but if you hand VLC.js an H.264 file I'm pretty sure it'll just create a <video> tag and then pretend like it did all the work.
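A sketch of how that delegation decision might look. Everything here is invented for illustration, since no code exists yet; `canPlayType` is the standard probe browsers already expose on media elements:

```javascript
// Decide whether to hand a file off to a native <video> element or
// decode it ourselves. `canPlayType` returns "probably", "maybe",
// or "" for a given MIME type.
function shouldDelegate(mimeType, canPlayType) {
  // Trust the native decoder for anything it claims to handle.
  return canPlayType(mimeType) !== "";
}

// In a browser this would be:
//   const probe = (t) => document.createElement("video").canPlayType(t);
// Here we simulate a browser that only knows H.264 in MP4:
const probe = (t) => (t === 'video/mp4; codecs="avc1.42E01E"' ? "probably" : "");

console.log(shouldDelegate('video/mp4; codecs="avc1.42E01E"', probe)); // true
console.log(shouldDelegate('video/ogg; codecs="theora"', probe));      // false
```

So an H.264 file goes straight to a `<video>` tag, and only the exotic stuff hits the ported decoders.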
Something that has always bugged me about VLC... the traffic-cone icon. While it was a brilliant piece of imagery to separate VLC from other third-party players, nowadays that marketplace is far less competitive, so maybe now is the time to switch the icon to something more formal?
I've started seeing VLC included on work-oriented desktop builds, and the cone icon sticks out like a sore thumb; most lay users have no idea what VLC is and assume it's a system utility (based on the icon).
Maybe a combo of a Video Player icon, with a smaller traffic cone on the side... ?
woa. i know its a joke drawing, but the cone you made looks like a wizards hat!
maybe the next icon should be an awesome orange striped wizard hat. for the end users, vlc plays all videos, magically ;)
> It is my belief that a Javascript (later WebAssembly) port of VLC, the VideoLan Player, will fundamentally change our relationship to a mass of materials and files out there, ones which are played, viewed, or accessed. Just like we had a lot of software locked away in static formats that required extensive steps to even view or understand, so too do we have formats beyond the “usual” that are also frozen into a multi-step process. Making these instantaneously function in the browser, all browsers, would be a revolution.
Something like VLC.js is anything but "[a] [s]imple [r]equest" :)
The web experiences some kind of rot anyway. As long as you can still access and play stuff you really want in at least one way (VLC), is the instantaneous part such a win?
It's a technical solution to a problem most people don't face. Are people really facing that many multi-media files that can't be played? Two bigger issues that come to mind: 1) stuff hosted on closed, proprietary sites that then close down 2) how to find stuff (especially with esoteric or unusual content). So IMO, VLC.js probably wouldn't "fundamentally change our relationship to a mass of materials and files out there".
I agree there is a delivery issue, but again I think it's findability and hosting - hence why youtube/soundcloud/spotify/netflix is a thing.
It's true, I've experienced this first-hand as I've tried to build my own simple "Netflix" which hosts my media collection and makes it available to all my devices. And as it turns out, the sweet spot of usability and compatibility is a simple Apache open directory of files.
The only issues I've faced with this setup are codec availability and bitrate:
Sometimes, I'll be in a situation where I only have a high-quality, high bitrate copy of some media on my server, and I'll be trying to access it over a low bandwidth connection. If I'm stuck in one place for a while, I can re-encode the media on the server, but I'm unable to begin streaming to my device until the new lower bitrate copy is finished encoding on the server side.
Other times, I'll be in a situation where I want to access a file, but I need it in a different codec, so again I'll have to re-encode it on the server and download once that's finished.
What would be amazing, and would solve both problems (I'm not sure this is possible) is some kind of real time, http-based transcoding proxy. So you give it a link to a media file somewhere on the web, and it returns an http link which is streamed and transcoded in real time and served back to the client, all over http (so existing http media streaming clients can use this as a drop in replacement).
If such a thing existed (or exists) I'll donate $100 right now.
I've used Plex before, and while it is fancy and has a nice user experience for its use cases, I've found it somewhat limited.
It is fundamentally locked into itself. You need to either have a device which can run the web client, or a Plex app for the platform. And that works fine if you're doing this just for yourself (I've run a Plex server in the past).
But sharing media is painful (or at least, more painful than simply sending a link to someone), and the overhead of compatibility was always annoying. Additionally, if you don't like the way the Plex app works on your device, tough, that's the only UI there is. However, there are a tremendous number of apps and applications with UIs galore that all support HTTP streaming.
Now, the last time I fully investigated Plex was about a year ago, but since then I've used friends' instances and it always seemed lackluster, or at least not as straightforward as bare-bones HTTP streaming.
On the other end of the flexibility + compatibility scale, if all you need is to watch The Wire on your iPhone/iPad/Apple TV v4 on the couch instead of at your computer, I've been using Infuse. Just turn on SMB on your computer and fave a few directories, then stream away. It'll even pull in metadata, cover art, descriptions and sort shows into seasons.
Alternative client/UI: check out PleXBMC, you can use any theme you like. That said, plex media player will provide a superior experience as it's using mpv.
I tried to find a link that would point to a definitive answer for you on this one, but I couldn't (well, short of the one in my media server). The best I could do was refer to the settings in my own player and my own experience. If the video requires transcoding, either due to network conditions or player capabilities, it can transcode up to 1080p at 20 Mbps. If the video can be played directly by the player, it can support up to 4K H.265 in an MP4 container (all other 4K video is currently transcoded down to 1080p). Of course, this assumes that the server you're running it on can handle transcoding the file. It's pretty common for folks to set up Plex servers on an RPi or an ARM-based NAS device, and those are rarely capable of re-encoding video. I run on a dedicated Core i5 small form-factor box with 8GB of memory, and I've successfully transcoded 4 streams for playback without issues.
The transcoder is pretty intelligent. It starts up at full power until it reaches a comfortable buffer, then dials back to an acceptable framerate to maintain stable playback while not melting the CPU. It only transcodes what it must in order to get the video to play on a device. My Roku, for instance, can handle H.264 encoded video and AAC/AC3 audio, but it won't play them back if they're stored in a Matroska (MKV) container, so the transcoder leaves the video/audio streams alone and just converts the container format to MP4 for playback. On my TVs that aren't hooked up to 7.1 systems, it transcodes only the audio. I have a small handful of files that are HEVC encoded and nothing that supports that, so the entire video/audio file is transcoded in those cases; but my wife and family, who enjoy Plex, don't have to think about it. I can tell when it's transcoding vs. playing directly because it'll stick at 33% for a second instead of playing the video instantly, but they're used to Netflix/Hulu/Amazon, which all have mild buffering before playback, so they have no idea that something's going on in the background to deliver their video.
It's a good solution as far as I'm concerned. At the end of the day, all of my videos play and they're a pretty good variety of .avi (with a mix of video/audio codecs), MKV/MP4 (almost all H.264/5) and a few one-offs (I haven't tried playing a few .asf files I have lying around but I wouldn't be terribly surprised if they worked -- it uses FFMPEG to do the transcoding so depending on how they compiled it, if FFMPEG handles it, I'd imagine Plex would, too).
I believe re-encoding is limited to 720p. You can still stream the source file. Although, if you have bizarre alien source files, re-encoding is necessary.
Try Subsonic. It's a media streaming platform with an HTTP frontend, and can do arbitrary transcoding of arbitrary files on the fly depending on network speed.
Did I mention that it has a free version which is perfectly usable and probably has most of the features you want? (IIRC, it's $10 a month for your own subdomain and support for the app, or you can download MadSonic, which has the features but not the domain.)
DLNA, like TVMobili or Serviio. The Twonky server family does that too, I think - never tried it. Now, the DLNA layer on top hides the underlying URLs, but I'm sure with some digging around in Wireshark you could figure them out. Using a DLNA client is much easier anyway.
This sounds like a very different goal than the port of MAME.
MAME is emulating video games, a video game is a piece of software that cannot be easily converted into something web-playable. Therefore in order to make the games accessible you need an emulator.
VLC is a media player. It supports more formats than a web browser, but audio and video files can be converted. If you have a file in an arcane, outdated video or audio format and you want to make it more easily accessible just convert it to something that is playable in a browser.
I'm not saying that a web VLC has no value. Conversion always means some quality loss, and there are some interactive features (like DVD menus) that are essentially software; making them accessible in a browser isn't possible simply by conversion. But it certainly has much less value than a web MAME.
> VLC is a media player. It supports more formats than a web browser, but audio and video files can be converted. If you have a file in an arcane, outdated video or audio format and you want to make it more easily accessible just convert it to something that is playable in a browser.
This is what the Internet Archive does right now. Unfortunately, transcoding from one lossy format to another just results in more quality loss. Something like VLC.js would sidestep that problem, and remove the need to transcode everything (and keep multiple copies!) as a bonus. You even mention that yourself, so I'm not sure I understand your point here.
I'm not sure I like this idea. This means even less control over my media files. Not only do they get streamed, but also the software that views them. I'm very likely not going to be able to use this offline, or outside of the browser in other apps. It's likely going to be slower, unless someone adds one-off extensions to JS to allow access to hardware decoding.
What would I suggest otherwise? Don't laugh, but something like NPAPI or ActiveX (COM). We had all these great component models in the 2000s. We could put them in a lightweight VM or a container for security. Instead of creating a window, we could let them directly draw to a shared-memory texture, which would solve some usability problems. I believe we could solve all problems that these technologies had nowadays.
Fundamentally, something like this belongs in the OS, not the browser (or any application). You should be able to use every codec and widget from other applications. We had that around 2000 with COM and VB6: drag the media player widget from a palette, drop it on your form, write a line of code to load a video.
Also, nobody gets to decide what codecs you are allowed to use for "political" or strategic reasons. Can't play something exotic? Just download the best-rated codec from your OS's app store (remember, it would run in a sandbox). Creating content and want to make sure everybody can see it without installing anything? Do what you do already today, and use the greatest-common-denominator codec shipped with all OSes.
Neither is a browser runtime. I would even say running code in a browser could be more dangerous, because you have huge complexity (HTML + JS + network stack + 3D engine + kitchen sink in modern browsers) and thus a large attack surface. Whereas if you built an isolation layer around an OS process (or bytecode), you could restrict the code to doing only what it has to (render to audio and video buffers), and you would have to audit only the wrapper (sandbox/VM/runtime), which is much smaller than the whole browser.
The reason browsers are relatively safe now is that so many brilliant people are working really hard on them, not because they are inherently safe.
This, a thousand times. The mess that is Plex sucks, a web browser tool which could play media files without all the drama would be a game changer for those of us who don't consume media through iTunes.
Though today the battle spreads from codecs to adaptive streaming standards. Apple (as expected) wasn't eager to support MPEG-DASH / MSE. It's as if they just can't do without messing things up for everyone.
- That will be *awesome*! Think of all the cool things we can do with this!
- Oh no, we're going to be parsing video formats in JavaScript. :(
- Hmm, I wonder if this will actually be better for security compared to a VLC browser plugin?
I guess we're not far from JavaScript: the one true language.
It's still not clear whether this is actually true or whether we'll just be writing JavaScript from C. It really depends on the quality of the APIs available.
My impression of WebAssembly was that browsers would go "oh, that looks like WebAssembly", bypass JavaScript, and compile the whole thing to native code?
Even without patent issues, multiple codecs are a security, maintenance, and implementation quality nightmare that could only be sustained by the maniacs working on ffmpeg. And there's no use case, because almost everything only has historical value.
You need multiple implementations of something for it to be a standard, and just try reimplementing SSA subtitles in Matroska and getting any of it right.
> Even without patent issues, multiple codecs are a security, maintenance, and implementation quality nightmare that could only be sustained by the maniacs working on ffmpeg.
Only as insecure as your browser's intersection with the video codec.
> And there's no use case, because almost everything only has historical value.
You talk as if historical value is low. I'd argue it's higher than supporting 4k or whatever's used to sell TVs these days. Not much point if we just throw out our existing content!
> You need multiple implementations of something for it to be a standard, and just try reimplementing SSA subtitles in Matroska and getting any of it right.
It's not about standardness or getting it right, it's about being able to play my video file on any level. Really, anything other than current behavior would be hugely preferable. iOS has more support for autoplaying me ads without my asking than it does for my own damn home movies.
> You talk as if historical value is low. I'd argue it's higher than supporting 4k or whatever's used to sell TVs these days. Not much point if we just throw out our existing content!
Well, commercial content can be re-encoded for new formats and usually is. Playing old files is limited to physical purchases of media, home videos, retro video games and internet porn.
Sure. This is why Apple doesn't have the consumer's best interests at heart: they prioritize business over building tools that work well with the assets people already have. But you didn't buy those, right, so they don't have value.
The root cause is, and continues to be, that we want HTTP to be an application protocol, which it was never intended or designed to be. I feel that until browsers behave more like independent operating systems/VMs, this kind of tension between what we want to do and what we can do will always exist on the web as we know it...
I mean, Canvas, WebGL, Audio APIs, Video APIs, VR APIs, etc, when do we realize that what we want is a way to run applications...
Specifically, one thing they mention is licensing with regard to browsers.
Parent's comment has relevance IMO because we're trying to shoehorn things to make everything work in the browser, and thus via HTTP, as if the browser were an operating system. The browser was built to browse web pages, and at that, didn't even necessarily need to be graphical (see lynx). It's becoming a monster, and many of us don't agree with the hand-waving style of justification for adding more and more.
Once upon a time, some people had a vision of using something optimized for files for files, another thing for chat optimized for chat, and so on. There were protocols for these, and apps on top of those. Not all these things were built well, efficient, and/or easy to use. The browser and http were good compromises in many ways, but certainly not without massive problems either. Fast forward and we've just compounded the problems and we keep working harder to lock ourselves into another limited set of technologies we already knew had fundamental issues (people used to/still do complain about processor architecture in similar ways).
I like the browser as a unifying, networked experience, but HTTP + the web browser are not really designed particularly well for some of the things we try to use them for from many perspectives. On top of that, with the issues of JavaScript, security, state, and other things, it gets even more messy. Somewhere along the way we seem to have become disinterested in protocols, lower-level networking, and generally challenging the norms.
Ironic that a lot of web developers preach things like KISS, and yet the browser is a huge example of a project where one could argue that KISS + don't throw in the kitchen sink has run amok. I understand this is not a popular opinion with some people, but I still think it is a valid one. Sometimes I think a little more pausing and thinking things through on lower-levels and with regard to the bigger picture are undervalued and could save us from some of the nonsense of the last few decades.
But before that, they used a VDT - as in like a VT100, 3270 or 5250 terminal. Everything is full circle - the web browser is just being treated as an intelligent terminal now.
I agree conceptually, but this points out some fun things.
So in other words, we tried that and failed in many ways in a much simpler domain. Now we are trying it again with a much bigger, more complicated scope and set of technical issues. Just like in the VTxxx days, people say something is compatible, and yet it is not 100%. Just like in those days, we deal with the craziness around all that as developers.
I've been working on some stuff dealing with old formats, including VT100. Diving into the source of some very big and well-known projects along with many lesser known things, it's amazing how wrong many implementations are when you check the standards. Related, I've seen the craziest implementations of telnet servers, ANSI SGR codes, and more. Translation: I have 0 faith in humanity to implement standards to the standards. I have less faith in the standards people to create sane standards.
What does HTTP have to do with canvas or anything else you listed? What does your misapprehension of what HTTP is meant to be have to do with the limited codec support in popular browsers? Do you just hate that the web stopped being static text 20 years ago?
For example, most (all?) browsers do not support the vastly upgraded H.265 yet. If we had ffmpeg, H.265 could've been supported in the browser over a year ago.
It's filled to the brim with heavily patented codecs. That's also why the browsers don't support H265. Note that Chromium and Firefox don't support H264 either exactly due to the patents. (Firefox will fall back to the system decoder, which, coincidentally, will be ffmpeg on a Linux system!)
If ffmpeg was a JS library, it wouldn't need to have anything to do with the browser vendors, it would be provided by the content host. The only concern is the binary size (and hey, that could be optimized by dropping codecs from the build that don't appear on the site).
The Media Source Extensions API (MSE) allows streaming audio/video playback in HTML. It's used widely today, including YouTube and Netflix. JavaScript libraries like Dailymotion's HLS.js can remux container formats on top of MSE, so you don't need native support for HLS in desktop browsers.
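For reference, the core MSE flow is small. In this sketch the codec string and the segment URL are assumptions, and the `MediaSource` wiring only runs in a browser; a remuxer like hls.js does essentially this after rewriting HLS's MPEG-TS segments into fragmented MP4:

```javascript
// The codec string tells MSE what the appended segments contain.
// This one means "H.264 Baseline video + AAC audio in fragmented MP4".
function sourceBufferMime() {
  return 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';
}

// Browser-only wiring: create a MediaSource, attach it to a <video>
// element, and append fragmented-MP4 segments as they arrive.
if (typeof MediaSource !== "undefined") {
  const mediaSource = new MediaSource();
  const video = document.querySelector("video");
  video.src = URL.createObjectURL(mediaSource);

  mediaSource.addEventListener("sourceopen", () => {
    const sb = mediaSource.addSourceBuffer(sourceBufferMime());
    fetch("/segments/init.mp4")          // assumed segment URL
      .then((r) => r.arrayBuffer())
      .then((buf) => sb.appendBuffer(buf));
  });
}
```

The key point is that the bytes can come from anywhere JavaScript can reach, which is what makes the remuxing trick possible.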
I've used HLS (native on iOS and in Edge) as well as hls.js, and it works well if you use the correct codecs. A drawback is that, because of its HTTP file-based distribution, it's not really low-latency (live). A UDP-based streaming protocol could surely do better.
A UDP-based streaming protocol for video would produce some crazy artifacts if packets are dropped or reordered, unless you insert I-frames very frequently (which would significantly increase the bandwidth requirement).
WebRTC is the HTML5 equivalent of RTSP (they are both based on RTP). You can use Janus to translate between the two - feed RTSP into Janus and it'll apply the necessary encryption and do the necessary firewall hole-punching to make it WebRTC compliant.
Raw playback of DVDs/BDs, that's what I'd like to see. This is something I miss on both the legal and the illegal streaming platforms - no way to get the "bonus content", or, especially in case of LotR, the different "editions" of a movie.
I think you'd use VLC's modular architecture to your advantage. Maybe you'd start with an Emscripten port, but then replace parts with a "native" JS implementation of tracker music support.
That's the first time I've heard of MediaSource being able to play from archives, though - do you have any more info?
MediaSource doesn't specifically support archives; it gives JavaScript the more general ability to generate data to be presented by an <audio> or <video> element. How the JavaScript code decides to generate that data is up to it -- be it loading chunks over HTTP, pulling them out of an archive, generating them from scratch...
And nearly every one of these does not do it correctly. I haven't recently checked this in VLC either since normally I just use the original trackers and other programs for playback on the real hardware.
From the GitHub page of the project you linked:

"None of the player classes fully implement all the features and effects in each file format, but all the major ones should be implemented. In addition, there most certainly will be some playback bugs in each player class - let me know if you run into some bad ones."
Additionally, I did not see support listed for some other tracker formats (though they are perhaps supported), such as IT. There seems to be a culture where "cool web page" or "on GitHub" means "works perfectly" to many people. Worse, sometimes these things do work well, get abandoned, and then something like a security change breaks them, so we're back to unsupported again.
As someone who has a passion for what some describe as "scene" formats, I have to say that 99% of the stuff I come across, whether MOD, ANSI, RIP, DIZ, whatever, is fundamentally broken or missing key functionality. The same holds true for other related tech such as vt100/vtxxx emulators. Music and ANSI are some of the worst offenders. Music is interesting since, even at the time, different programs would play things back differently, but there was some degree of consistency to keep things playable. Literally every ANSI parser in the popular places I check has some combination of horrible bugs, missing parts of the standard, abysmal performance, inaccuracies, and worse. It's as if the authors didn't read the specs, because, well, they probably did not (or at least didn't understand them).
The situation gets worse when you think about preservation. Some of the formats had multiple versions and iterations, not to mention were built with different processors and architectures in mind. I was working with one format recently that used a little-endian packed struct that added fields between versions and stored this data as binary. If you read a previous version of the format with most of the code out there (which supports only the latest), it would actually crash. I've hit this with archives too, such as LZH/LHA, which are especially used in retro settings or on some platforms.
There's a lot of complexity to older formats and the evolution of things that appeared in more wild circumstances and/or times. Implementations that work perfectly are a rarity and worthy of attention. VLC in this regard does a better job than most. So I think part of the logic whether I agree or disagree is that VLC is widely used, maintained, and active, so it's a bit better for preservation at the moment than random collections of github libraries.
For ios I can really recommend modizer: http://yoyofr.blogspot.se/p/modizer.html
Otherwise the only good generic mod player i've used was deliplayer, unfortunately abandonware.
This would definitely be cool, but I'd be worried about using this because of mobile devices. It can't be good for battery life to be processing video through JavaScript. I know ios and many android devices have hardware assisted h.264 decoding that's very efficient to make video playback work well with great battery life.
The web plugin requires installation and draws a native window with native code, onto your browser viewport.
"VLC.js" would theoretically take an HTML <video> or <canvas> tag, as well as a filename, as arguments... and that'd be it. No setup, no nothing. You would just have video, playing.
I've had some bad experiences with software decoding on mobile platforms, though, so I don't really see any advantage to having it as a JS file besides "look at what's possible with Emscripten". It'd wreck your laptop's battery life and would probably be unplayable on mobile. Why not just have your user... download VLC?
Question: does VLC handle many more formats than ffmpeg? I imagine the latter would be much simpler to compile (so that it'd convert Any Format™ to Some Common Modern Format Supported By All Browsers™).
> There is a lot more than just codecs and formats in a media player. :D
As you are the VLC lead, I understand where you are coming from. Nevertheless, depending on the perspective, a media player can be legitimately viewed as codecs+formats+some glue.
For example, ffplay, which ships with FFmpeg, has < 5k lines of code, and although not as full-featured as VLC, it definitely serves basic media playback needs.
Or if that is too extreme, per the Open Hub stats, mpv does media playback with ~ 100k lines of code, once again in < 1/4 of the lines of FFmpeg (even with the hardcoded lookup tables ignored).
> As you are the VLC lead, I understand where you are coming from. Nevertheless, depending on the perspective, a media player can be legitimately viewed as codecs+formats+some glue.
The glue is quite large.
> For example, ffplay that is shipped with FFmpeg has < 5k lines of code
I hope you are joking. Look at the performance of ffplay...
Then how is it that mpv gets the "glue" work done with significantly fewer lines than VLC, as I pointed out?
> I hope you are joking. Look at the performance of ffplay...
I am obviously not joking here. I used (and still use) ffplay for a large chunk of media playback.
The only reason I do not is for the rare hardcoded sub need, which may get at least partially addressed in a future gsoc project.
The GSoC idea is on the wiki and has been discussed to some degree on the mailing list, so this is not mere speculation.
A blanket statement about performance is also obviously false here; it really depends on the codec/config settings.
For example, I simply tested with the first link (a 2160p HEVC video) at http://4ksamples.com/, to really put all three media players through their paces. This was on my laptop, a Core i7 2.4 GHz Haswell with Intel graphics on Arch Linux, with the default configuration for all of ffplay, mpv, and vlc.
It played essentially fine with mpv and ffplay, with minimal stutter. VLC was the only one on which playback actually crashed, with numerous "[hevc @ ...] Could not find ref with POC ..." messages, stalls, and display artifacts rendering playback useless along the way before the crash at ~00:30.
I tend to agree with you on this, personally. I can't speak to the code quality or whether or not VLC is a mess (I've never really looked) but I think the point here is the ability to play back (almost) any video/audio/subtitle format around which is met by both VLC and FFMPEG quite well. Being able to do that in a browser, without plug-ins, in a performant manner is important from an archival perspective (broadest audience support for the broadest array of file kinds/types).
Unfortunately, the ultimate goal -- taking the control of "what codecs are supported for streaming" away from the browser vendors probably won't be met. The biggest issue there is supporting reliable playback across a variety of processors with performance/battery constraints. For desktop users it might be a doable thing, but unlikely for mobile. Granted, we're doing things on our mobile devices that would have been positively unimaginable 7 years ago, so who knows?
At the end of the day, though, it's codec support that is the big win from my perspective -- lavc/lavf or FFMPEG -- not necessarily getting all of VLC ported.
> VLC itself is a bit of a mess, the UI is clunky and and isn't a great media player experience.
If it's that bad, why so many people use it? Sorry, but your comment is harsh, unexplained and very subjective... Especially coming from someone working for Apple.
I'm currently putting what little few hours I have into a few side projects that touch things like XM/MOD and other things like ANSI, RIP, etc. I always seem to share Jason's nostalgia and I get his meaning, but I often don't see eye to eye with him on a lot, whether his treatment of BBSs or some things like emulation.
There are so many red flags here, from licensing to browser vendors to JavaScript to threading to security, I don't know where to begin. It's not that these things aren't doable (even stupid hacks like transcoding are readily doable), it is just that I do not agree with the notion that this must be in the browser for the sake of digital preservation or some sort of user experience.
I know this is a very unpopular opinion in some parts (perhaps HN), but why must everything be in JavaScript and in the web browser? I understand we want things to be cross-platform and network accessible, but I must say that the drift from native to the browser is becoming increasingly disheartening and inspiring constant wtf moments. I get that a lot of people use the web and some people even see it as the computer itself now, but we rarely pause to ask if this is a good thing. Moreover, we invest a lot of time and effort in things that aren't really worth it in the end. I wish in computer science there were more sanity checks and willingness to make progress, rather than to paint flat tires or more crudely, shine turds. The term sunk costs comes to mind.
I like the web and the browser as much as the next person for browsing web pages, but it is not an operating system. Stop making it do everything and anything; that's why I bought a computer, not a web browser machine (yeah, I know, ChromeOS). I did not buy my computer so I could stab a few of its cores in the face or make it feel like it is running several orders of magnitude under-clocked. I feel like this is what my browser is doing to me when I see <insert latest abuse> or visit a page displaying a few poorly aligned paragraphs of text that results in more RAM usage than the total RAM my computers had for several decades.
What are the benefits and do they really outweigh the costs? I understand that building binaries is full of issues, as is compilation, processor architecture, and so many other parts of the native experience. But the thing is we spend so much time on maintaining and improving our already dated and flawed OSs that at least generally run fast and sanely.
Why would we want to shoehorn yet more things we already do happily in our OS into the browser? Why move something from efficient environments to inefficient ones? Why have to lobby or wait on a standards committee to allow something? Why wait on browser vendors to agree or disagree to make it more complicated? Why am I trading taskbars, terminal sessions, or other OS constructs for browser tabs (can we have new or other ways to mix content types)? How is the browser better as a unifying experience than my OS? Do I really even want videos on web pages or do I want something that is built better for video content? Why not just use what's inside my computer, like that expensive GPU, CPU, RAM, etc. without some ridiculous workarounds or barriers? These are all just random topics to think about, however the point is that it has to be more than, "Wow, that was cool but stupid" moments or "it's in the browser."
We have already put herculean efforts into making safe, pleasant, efficient, cross-platform languages. There are many successes and failures, and the jury is almost universally still out. But the barriers there are much smaller and the environments much more sane. In light of the move to multi-core architectures and bumping up against x86 limitations/physics, and eventually the same for others, it seems insane to me to throw away what we do have right now and relegate it to some browser environment that often struggles to do things I can easily do faster/smoother over a terminal session.
I feel with every new "XYZ, but in the browser" we are taking a step away from both the Internet and the general utility of the computer and stomping on what remains. Instead, if we want experiences that must use the Internet, I would like to see more lower-level things implemented. I want to see the next browser, the next usenet, something new that can be built on, and other higher-level things for the everyday user. One day it is mobile apps, the next day browser apps, and so on. Let's forget what is trendy or fun and get back to being creative and making progress, leaving the rest to fun "because" side-projects.
Finally, getting back to what Jason and/or the archive does specifically, how does this help preservation? It seems to me that browser vendors and browsers in general don't have the best track record here and are an added layer of complication vs. <insert OS/programming language/VM/runtime/shim/emulator>. We've also seen every kind of ancillary/hosted/browser-native solution come and go, whether it was applets, flash, activex, whatever. I wish the browser were better with video, but VLC is not what would make it better; designing and allowing the proper support from the ground up would be the right way, and at worst, that requires the help of the committees and the vendors. I do love VLC.
The Web is the only platform for mass-market applications that is not controlled by a single vendor. Not coincidentally, it has multiple independent implementations with significant market share that interoperate pretty well. It's the only platform that lets software vendors deliver applications (mostly) unhindered to all users, even users who are otherwise trapped in walled gardens.
So if you fear monopolies and want to see ongoing untrammeled innovation in mass-market applications, you want the Web to thrive.
> ..., but why must everything be in JavaScript and in the web browser?
In this case, the point is to make that archived data readily accessible. An internet archive where you need to track down and install the right native application to view each record is a bit like those rarefied museum collections locked away in the vast basements of large academic departments, accessible only to those few scholars who are familiar with the hoops they need to jump through to be allowed in.
I don't think the idea is that if you had, say, a large collection of MOD files you enjoy, that you should throw away your native tracker and use the browser with a JS tracker instead. Rather, the idea is that if there's a discussion somewhere of the early 90s Amiga demoscene you can post a link to a particular MOD file and know that everyone will be able to play it straight from there.
Agreed. An archive where the data is preserved but you have to really work to view things is good, but one where you can just browse is better. Since we don't have to worry about accidental damage caused by people who are just browsing, and people who are just browsing don't limit the ability of a researcher to use other tools, we can have both.
Another good example is MIDI. As a browser of MIDI files I just want to hear them play. If I'm researching them, then I'll want to take the time to figure out what hardware and software synthesizers people were actually using when they composed them and listened to them.
Neither use-case impinges on the other, so there's no reason at all to avoid implementing an in-browser MIDI player. Or an in-browser media player that can play nearly everything, in this case.
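An in-browser player along these lines would first need to recognize what it has been handed. As a minimal sketch (synthetic bytes only, not tied to any real archive code), a couple of magic-byte checks are enough to distinguish a Standard MIDI File from a classic ProTracker MOD:

```javascript
// Format sniffing by magic bytes -- a sketch, not production code.
// Standard MIDI Files begin with the ASCII chunk ID "MThd"; classic
// 31-sample ProTracker MODs carry the signature "M.K." at offset 1080.
function sniffFormat(bytes) {
  const ascii = (off, len) =>
    String.fromCharCode(...bytes.slice(off, off + len));
  if (bytes.length >= 4 && ascii(0, 4) === "MThd") return "midi";
  if (bytes.length >= 1084 && ascii(1080, 4) === "M.K.") return "mod";
  return "unknown";
}

// Synthetic example: the first bytes of a Standard MIDI File header.
const midiHeader = new Uint8Array([0x4d, 0x54, 0x68, 0x64, 0, 0, 0, 6]);
console.log(sniffFormat(midiHeader)); // "midi"
```

A real player would of course go on to parse the full header (track count, timing division, sample table), but dispatching on signatures like these is how a "plays nearly everything" front-end would route each file to the right decoder.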
On the other hand, downloading a 4 GB DVD image full of MPEG-2 streams is pretty wasteful of bandwidth; if it weren't for the menus and whatnot, it'd be better to transcode it to something modern for playback.
> I know this is a very unpopular opinion in some parts (perhaps HN), but why must everything be in JavaScript and in the web browser?
> I wish in computer science there were more sanity checks and willingness to make progress, rather than to paint flat tires or more crudely, shine turds.
For the same reason you continue to use x86. It's deployed everywhere and that will not change.
I happen to think that wasm is pretty well designed, so I don't put much stock in assumptions along the lines of "it's in the browser, so it must be slow".
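For what it's worth, the shared-memory primitive described upthread (SharedArrayBuffer with sequentially consistent atomics) can already be sketched in plain JavaScript. This is an illustrative toy, assuming a runtime that ships SharedArrayBuffer and Atomics; the Worker plumbing that would make the two sides actual threads is omitted:

```javascript
// Sequentially consistent atomics on shared memory -- the primitive that
// threaded wasm (and a threaded VLC.js) would build on. In a real setup
// the SharedArrayBuffer would be posted to a Worker; here both "threads"
// run in one scope purely for brevity.
const sab = new SharedArrayBuffer(8);
const shared = new Int32Array(sab); // two 32-bit cells

// "Producer": write a payload, then publish a flag with a seq-cst store.
Atomics.store(shared, 0, 42); // payload
Atomics.store(shared, 1, 1);  // flag: data ready

// "Consumer": a seq-cst load of the flag guarantees the payload written
// before the flag is also visible -- no torn or stale reads.
if (Atomics.load(shared, 1) === 1) {
  console.log(Atomics.load(shared, 0)); // 42
}
```

The fully defined race semantics mentioned above mean that even plain (non-atomic) accesses to the same buffer can't produce undefined behavior, only unspecified-but-valid values.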
> It's deployed everywhere and that will not change.
Firstly, why do you assume I use x86? I spent a good chunk of the last decade programming against PowerPC, Cell, and ARM in my day jobs. From what I am seeing with regard to x86, there is a small but increasing chance that it might change long-term (or needs to). But leg-pulling aside...
Anything can, will, and does change. People seem to think the browser belongs to the people, but history has spoken to the contrary. The browser is in the hands of a few large companies, some more benevolent than others. There are also committees, and those are often stocked with people from companies as well. The average person has little say over the progress of the web itself, just like they have little say over the OS. We have literally said what you said about <pick any technology> since the 60s.
I'd argue that the browser is not in fact deployed everywhere and does change, with great friction. The browser has a terrible record with regard to universal compatibility and deployment. Every time they improve, they get worse again in some other way. Go look at MDN and the cross-browser tables they list for certain features like video. Further, browsers have a long record of making dramatic changes and shipping APIs with new or breaking changes. There is also a long list of technologies specifically for media experiences that are effectively no longer around or won't be long-term: ActiveX, Applets, Flash, Silverlight, VRML, the list goes on.
Additionally, tons of people are still using ancient browsers, and probably more are on at least outdated versions, which introduces lots of inconsistencies, incompatibilities, security holes, and performance issues, among other things. There are just as many if not more problems in this regard with compatibility vs. popular operating systems. Writing compatibility with older apps/standards/formats for an operating system is also miles easier, as worst case, ancient stuff can be run pretty readily via emulation or simply ported. Obviously it is by no means easy, but generally time and individual effort are the main obstacles rather than corporations.
There are things about the browser I like, for example that it allows content to easily be network accessible (in conjunction with http/servers). But we have an entire Internet and huge set of people to come up with other solutions too. I'm not saying the browser can't do many things, of course it can, but rather it is a sinking ship that we pile more baggage on and we refuse to stop. While speed is a concern, I've been more bothered by things such as development experience, access to hardware, security, threading (VLC author directly mentions this), rendering, graphics API, and many other things. If the browser is just going to be a frame around something that could run more directly in the OS, I am not sure if I see the point long-term. People make fun of Emacs for far less.
I obviously understand the needs of the archive, but with all due respect I think they overestimate the range of their audience when it comes to browsing MODs and such (i.e. not your mom/pop 99% of the time). This is a bigger discussion. I could just as easily pull something down from a web server via a native app like Kodi for arguably a better experience, though I am not suggesting that as a solution. What I am saying is that I am disappointed by the lack of creativity and the pigeonholing of the internet as a browser-only experience, when as of a few years ago, 1960s/70s/80s problems such as aligning text reliably were seemingly out of the grasp of the average website. "This time we got it right" is not a good attitude either, and is rarely, well, right. The browser's track record is not a good one, and "a lot of people use it" has never been a good excuse for much of anything; it hinders progress. A lot of people supported a lot of bad things throughout human history, but that doesn't change the fact that they are bad. I get pragmatism, but for something like an archive I think we can do better and have more time to do it. Write what you can today in JS if you must, quickly, and get some experience to the users, but don't move mountains to do it. If the effort to pigeonhole something takes as long as just developing a more correct solution, what is the point? There is also a larger issue upstream of where content is viewed, involving how it is browsed, searched, preserved, and served to the end-user; the browser is just one end-point of that.
With regard to Wasm, I don't care if it is fast or slow. I am more concerned with the development experience, from feel to debugging to maintenance. Having worked with multiple architectures and more direct experiences, I am not filled with confidence when attempting more complex things. Even writing code for, let's say, Cell or 32- vs. 64-bit architectures on your PC and then hoping it works is not without its flaws. If anything, we as computer scientists have proven we are really bad at "write here, deploy there" style development, even if the write-side is compatible or the same as the target. But again, the problem isn't so much Wasm in and of itself; it is the ecosystem it has to exist inside and thus be limited by.
I'd answer the rest of these replies, but my main, much better written response was lost between writing it, being rate limited, and posting it. I zoned out and shut down. Neither OS nor browser helped :)
mame as javascript only worked because distribution of mame is garbage.
go ahead. don't believe me? go on and try to install it now. you will waste days just trying to figure out where you can find some version and then how to hook up that nice frontend you saw on some video demo of a rare mame setup.
what they did was excellent. because the alternative was messy.
vlc is already available in a very nice package everywhere. Instead of that futile effort, just convince everyone to publish the media file (and maybe subtitles) in a standard way, instead of in a proprietary flash-player replacement or proprietary stream app. Your browser will already send it to vlc just fine.