Even without patent issues, multiple codecs are a security, maintenance, and implementation quality nightmare that could only be sustained by the maniacs working on ffmpeg. And there's no use case, because almost everything only has historical value.
You need multiple implementations of something for it to be a standard, and just try reimplementing SSA subtitles in Matroska and getting any of it right.
> Even without patent issues, multiple codecs are a security, maintenance, and implementation quality nightmare that could only be sustained by the maniacs working on ffmpeg.
It's only as insecure as the intersection of your browser's attack surface with the video codec's.
> And there's no use case, because almost everything only has historical value.
You talk as if historical value is low. I'd argue it's higher than supporting 4k or whatever's used to sell TVs these days. Not much point if we just throw out our existing content!
> You need multiple implementations of something for it to be a standard, and just try reimplementing SSA subtitles in Matroska and getting any of it right.
It's not about standardness or getting it right, it's about being able to play my video files at all. Really, anything other than the current behavior would be hugely preferable. iOS has more support for autoplaying ads at me without my asking than it does for my own damn home movies.
> You talk as if historical value is low. I'd argue it's higher than supporting 4k or whatever's used to sell TVs these days. Not much point if we just throw out our existing content!
Well, commercial content can be re-encoded for new formats and usually is. Playing old files is limited to physical purchases of media, home videos, retro video games and internet porn.
Sure. This is why Apple doesn't have the consumer's best interest at heart: they prioritize business over building tools that function well with the assets people already have. But you didn't buy those, right, so they don't have value.
The root cause is, and continues to be, that we want HTTP to be an application protocol, which it was never intended or designed to be. I feel that until browsers behave more like independent operating systems / VMs, this kind of tension between what we want to do and what we can do will always exist on the web as we know it...
I mean, Canvas, WebGL, Audio APIs, Video APIs, VR APIs, etc. - when do we realize that what we want is a way to run applications...
Specifically, one thing they mention is licensing with regard to browsers.
Parent's comment has relevance IMO because we're trying to shoehorn things to make everything work in the browser, and thus via HTTP, as if the browser were an operating system. The browser was built to browse web pages, and at that, didn't even necessarily need to be graphical (see lynx). It's becoming a monster, and many of us don't agree with the hand-waving style of justification for adding more and more.
Once upon a time, some people had a vision of using something optimized for files for files, something optimized for chat for chat, and so on. There were protocols for these, and apps on top of those. Not all of these things were well built, efficient, or easy to use. The browser and HTTP were good compromises in many ways, but certainly not without massive problems either. Fast forward and we've just compounded the problems, and we keep working harder to lock ourselves into another limited set of technologies we already knew had fundamental issues (people used to/still do complain about processor architecture in similar ways).
I like the browser as a unifying, networked experience, but HTTP + the web browser are not really designed particularly well for some of the things we try to use them for from many perspectives. On top of that, with the issues of JavaScript, security, state, and other things, it gets even more messy. Somewhere along the way we seem to have become disinterested in protocols, lower-level networking, and generally challenging the norms.
Ironic that a lot of web developers preach things like KISS, and yet the browser is a huge example of a project where, one could argue, KISS has been abandoned and the kitchen-sink approach has run amok. I understand this is not a popular opinion with some people, but I still think it is a valid one. Sometimes I think a little more pausing and thinking things through at lower levels, and with regard to the bigger picture, is undervalued and could save us from some of the nonsense of the last few decades.
But before that, they used a VDT - as in a VT100, 3270, or 5250 terminal. Everything comes full circle - the web browser is just being treated as an intelligent terminal now.
I agree conceptually, but this points out some fun things.
So in other words, we tried that and failed in many ways in a much simpler domain. Now we are trying it again with a much bigger, more complicated scope and set of technical issues. Just like in the VTxxx days, people say something is compatible, and yet it is not 100%. Just like in those days, we deal with the craziness around all that as developers.
I've been working on some stuff dealing with old formats, including VT100. Diving into the source of some very big and well-known projects along with many lesser known things, it's amazing how wrong many implementations are when you check the standards. Related, I've seen the craziest implementations of telnet servers, ANSI SGR codes, and more. Translation: I have 0 faith in humanity to implement standards to the standards. I have less faith in the standards people to create sane standards.
What does HTTP have to do with canvas or anything else you listed? What does your misapprehension of what HTTP is meant to be have to do with the limited codec support in popular browsers? Do you just hate that the web stopped being static text 20 years ago?
For example, most (all?) browsers do not support the vastly upgraded H.265 yet. If we had ffmpeg, H.265 could've been supported in the browser over a year ago.
ffmpeg is filled to the brim with heavily patented codecs. That's also why the browsers don't support H.265. Note that Chromium and Firefox don't ship H.264 decoders either, precisely because of the patents. (Firefox will fall back to the system decoder, which, coincidentally, will be ffmpeg on a Linux system!)
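This is why codec support has to be probed at runtime rather than assumed. A minimal sketch using the standard `HTMLMediaElement.canPlayType()` method - the helper function, candidate MIME strings, and fallback order here are illustrative, not from any particular codebase:

```javascript
// Pick the first codec string the environment claims it can decode.
// canPlayType(mime) returns "", "maybe", or "probably" per the
// HTMLMediaElement spec; anything non-empty counts as a candidate.
function pickSupportedCodec(canPlayType, candidates) {
  return candidates.find((mime) => canPlayType(mime) !== "") ?? null;
}

// In a browser you would pass the real method (illustrative usage):
//   const video = document.createElement("video");
//   const mime = pickSupportedCodec((m) => video.canPlayType(m), [
//     'video/mp4; codecs="hvc1.1.6.L93.B0"', // HEVC / H.265
//     'video/mp4; codecs="avc1.42E01E"',     // H.264 baseline
//     'video/webm; codecs="vp9"',
//   ]);
```

A Chromium build without H.264 simply returns `""` for the `avc1` string and the site falls through to the next candidate.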
If ffmpeg were a JS library, it wouldn't need to have anything to do with the browser vendors; it would be provided by the content host. The only concern is the binary size (and hey, that could be optimized by dropping codecs from the build that don't appear on the site).
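Stripping unused codecs from an ffmpeg build is an established pattern via its configure script. A sketch of what such a site-specific build might look like, assuming the hypothetical site only serves H.264/AAC in MP4 containers:

```shell
# Illustrative minimal ffmpeg configure: disable every component,
# then re-enable only what this hypothetical site actually needs.
./configure \
  --disable-everything \
  --enable-decoder=h264 \
  --enable-decoder=aac \
  --enable-demuxer=mov \
  --enable-parser=h264 \
  --enable-parser=aac \
  --enable-protocol=file
```

(The `mov` demuxer also handles MP4. A build like this is dramatically smaller than a full one, which matters when the decoder is shipped to the client with every page.)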