tag:blogger.com,1999:blog-73296012024-03-13T21:25:29.913+01:00robUx4My thoughts for anyone to know (as if you cared).robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.comBlogger202125tag:blogger.com,1999:blog-7329601.post-4070463515781144642020-09-14T08:47:00.001+02:002020-09-14T08:47:49.803+02:00DXVA AV1 decoding in VLC<p>As AV1, <a href="https://aomedia.org/av1/">the royalty-free video codec</a>, is getting some traction, hardware decoding is finally arriving. On PC the new <a href="https://www.nvidia.com/en-us/geforce/news/rtx-30-series-av1-decoding/">NVIDIA RTX 30</a> GPUs and <a href="https://newsroom.intel.com/news-releases/11th-gen-tiger-lake-evo/">Intel Tiger Lake/Xe GPUs</a> support hardware decoding of 8-bit and 10-bit 4:2:0 sources (the most common formats for online videos). That's sufficient to decode 4K HDR content with moderate power usage.<br /><br /><a href="https://www.videolan.org/projects/dav1d.html">dav1d</a>, the AV1 software decoder, has been adopted by almost everyone. It's the <a href="https://medium.com/@ewoutterhoeven/av1-is-ready-for-prime-time-part-2-decoding-performance-d3428221313">fastest on all CPU platforms</a>. So it was only natural to add hardware decoding support to it, making it even faster when the hardware can help.
With the help of some <a href="https://code.videolan.org/mwozniak/dav1d/-/tree/dxva">test code</a> by one of the <a href="https://www.microsoft.com/en-us/download/details.aspx?id=101577">DXVA AV1 spec</a> authors, and after some fixing, tweaking, testing and <a href="https://code.videolan.org/robUx4/vlc/-/tree/dav1d-dxva/6">wrapping in VLC</a>, I managed to get AV1 decoding working on the NVIDIA and Intel (test) hardware on Windows 10.<br /><br />Since the DXVA AV1 spec is still marked as subject to change and Microsoft didn't add the relevant structures to its latest <a href="https://developer.microsoft.com/en-US/windows/downloads/windows-10-sdk/">Windows 10.0.19041.0 SDK</a>, the code is still in alpha status and not merged into dav1d/VLC (or wine/mingw64, which are the toolchains VLC uses).<br /><br />If you have any of this hardware and want to test AV1 decoding, you can download <a href="https://people.videolan.org/~jb/Builds/DxVA_AV1/vlc-4.0.0-dav1d-dxva-win64.exe">this special VLC 4.0 build</a>. Only 64-bit Windows 7/8/10 is supported. It is signed by <a href="https://www.videolan.org/">VideoLAN</a> to make sure it's legit.<br /><br />As a bonus, you also get a glimpse of the new VLC 4.0 refreshed UI and media library.</p>robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com1tag:blogger.com,1999:blog-7329601.post-70783211604325290702019-12-08T18:46:00.002+01:002019-12-09T08:32:58.475+01:00Happy Birthday MatroskaOn Friday <a href="http://matroska.org/" target="_blank">Matroska</a> turned 17. There was a celebration at the <a href="https://mediaarea.net/NoTimeToWait4" target="_blank">No Time To Wait 4</a> conference in Budapest, with a nice cake bearing the MKV initials. I wanted to make a small speech for the occasion but didn't find the opportunity (and then I had a train back to France to catch). So here are a few of the things I would have said, and more.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://twitter.com/nttwconf/status/1202972801818386432" target="_blank"><img alt="NTTW 4 MKV17 Cake" border="0" data-original-height="340" data-original-width="586" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjIZ1had7QEh-FfcAJvJQxmG8u6-Dm9E81crSwwnSispOB-jDRF4JXX12rifqcv7HH28BwIfYTuD-EV8c-GJbZaLl2pN_yTASoxfqh8Vo0NFa89xTM-jDHQaC6IGXUtXbnR5Rc/s1600/cake.jpg" /></a></div>
<br />
<br />
It was nice to be back at No Time To Wait. It's such a fantastic gathering of archivists and developers, and the atmosphere is always great. It's nice to feel welcome even though I am not an archivist and know little of what they have to go through in their job. Yet every time after NTTW I feel like I know a bit more, and can thus help more.<br />
<br />
The work they do with Matroska (combined with FFV1) is exactly one of the main uses Matroska was designed for. Except neither I nor <a href="https://robux4.blogspot.com/2017/11/the-many-fathers-of-matroska.html" target="_blank">the other creators of Matroska</a> ever dreamed it would be used professionally for prestigious archival like at the BFI (British Film Institute) and possibly at the Library of Congress in the USA. Of course there was already traction in the "corporate" world, as MKV is used for all kinds of video sharing as files and is also the basis of WebM. It's officially supported in OSes like Windows 10 and Android. But the archival world is a different thing. It's not only about sharing the latest movie/TV show you got; it's about keeping important content for a long time. IMO it has a deeper impact in the long term, and some historical, political and artistic value that can't be beaten. This is also what makes it so special to me, as I always try to find an extra bit of "soul" in whatever I do. It's not just about writing some code or documentation. It's also a great motivation knowing it will benefit something bigger.<br />
<br />
It's amazing what a hobby project started (forked) 17 years ago has become. In my mind (and probably in the minds of all others involved) we were driven by the same spirit that was on the Internet at that time: creating something great for free and possibly challenging the "corporate" world. The U in my robUx4 nickname stands for Utopia. That was always part of the goal.<br />
<br />
Just talking about the details of Matroska, and how we came to each of those decisions, always reminds me of the amount of work we put into this and all the challenges we faced (like trying to be as good as Ogg for streaming). It's fun having to go back through all these memories while we're doing the IETF specifications, and to realize the things we got right as complete n00bs and some we got wrong (non-rational nanosecond timestamps that we called timecodes, to show how n00b we were; the clock was even in floating point, because in the analog world clocks aren't perfect). Matroska was designed to last 10 years, in a time when there was a new video codec every 3 months. It was still hard to predict the full evolution of things (no VR, for example). The challenges posed by long-term archival are also interesting. Here we have to support all kinds of sources (analog and digital) with very specific characteristics (and keep everything, as RAWcooked does thanks to attachments). There are still plenty of areas to improve (timecodes, Bayer support). After 17 years, Matroska still has room to grow.<br />
<br />
I am very grateful to have met all kinds of new people who care deeply about the work we do on Matroska, and to see again familiar faces who care at least as much. You have no idea how great it is to have all of your support, and that you took a leap of faith choosing Matroska when we had just started making proper specifications. At NTTW1 it was not very clear whether people would actually use Matroska at all.<br />
<br />
I would like to thank in particular the organizers of the event: Dave, Jerome, Ashley, Alessandra and Zsuzsa. It's a privilege to be welcomed into your community.<br />
<br />
👓👓<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://showyourstripes.info/" target="_blank"><img alt="Budapest ShowYourStripes" border="0" data-original-height="800" data-original-width="1600" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiv-8ooaBuquVyIZugaCEQabnxr8-VW_dCbulMMwTgiIuQ9QSgQ39HXwyei2_noxvGDSXwP2xLn-4nZd8POSkz2C2kxlEpmHIF68nmh_ixeKfG3auWTnIeVglKRlE7AHINX4gI/s400/_stripes_EUROPE-Hungary--1901-2018-BK.png" width="400" /></a></div>
robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-55914599316678940052019-03-10T17:00:00.000+01:002019-03-10T17:00:53.179+01:00Windows Video Playback PerformanceAs a VLC developer I have spent a lot of time working on video decoding and displaying for Windows, especially with Direct3D11. VLC 3.0 is the result of that work and we keep on improving it.<br />
<br />
Despite that, people still complain about the performance of VLC compared to other Windows players. I know we did a good job, so I wanted to see where we stand regarding performance.<br />
<br />
I tested the following software with various source files on 64-bit Windows:<br />
<ul>
<li><a href="https://ftp.free.org/mirrors/videolan/vlc/3.0.6/win64/vlc-3.0.6-win64.exe" target="_blank">VLC 3.0.6</a></li>
<li>Movies & TV (10.18102.12011.0 preinstalled on Windows)</li>
<li><a href="https://binaries.mpc-hc.org/MPC%20HomeCinema%20-%20x64/MPC-HC_v1.7.13_x64/MPC-HC.1.7.13.x64.exe" target="_blank">MPC-HC 1.7.13</a></li>
<li><a href="https://freefr.dl.sourceforge.net/project/mpcbe/MPC-BE/Nightly%20Builds%20%28from%20svn%20trunk%29/1.5.3%20%28build%204322%29%20beta/MPC-BE.1.5.3.4322.x64-installer.zip" target="_blank">MPC-BE 1.5.3.4322</a></li>
<li><a href="https://sourceforge.net/projects/mpv-player-windows/files/64bit/mpv-x86_64-20190210-git-f2e7e81.7z/download" target="_blank">MPV 20190210 </a></li>
<li><a href="https://kodi.tv/download/849" target="_blank">Kodi v18.0</a> </li>
</ul>
All players have been used with their default settings, fresh from installation. <br />
<br />
My system is an i7-8700 and I used the integrated Intel 630 GPU, driving a 2560x1440 display at 120 Hz over DisplayPort. The CPU has 12 logical threads at up to 4 GHz, so software decoding should be workable.<br />
<br />
Here are the results for the various test files:<br />
<br />
<h2>
<a href="https://4kmedia.org/sony-camping-in-nature-4k-demo/" target="_blank">Sony Camp HEVC HDR 4K 60 fps</a></h2>
<br />
This is the main sample I used when working on HDR. It has a high bitrate that I have a hard time playing over my NAS, and even locally it can stutter in the hardware decoder (we only found a fix for that recently). Apart from 8K and AV1, that's pretty much the hardest thing to decode right now. Not only that, but the HDR content needs to be handled properly. In my case the screen is not HDR, so tone mapping has to be applied by the player (the HDR mode of Windows is not enabled).<br />
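To give an idea of what tone mapping involves, here is a minimal Python sketch of one common operator, extended Reinhard. This is an illustration only, under my own simplifying assumptions (luminance-only, fixed peak values): it is not VLC's actual tone-mapping code, which is far more elaborate and runs in GPU shaders.

```python
def tonemap(nits, sdr_peak=100.0, hdr_peak=1000.0):
    """Map an HDR luminance (in nits) onto an SDR display range using
    the extended Reinhard curve. Illustration only -- not VLC's code."""
    l = nits / sdr_peak                 # work in units of the SDR peak
    l_white = hdr_peak / sdr_peak       # brightest input, mapped to 1.0
    mapped = l * (1.0 + l / l_white ** 2) / (1.0 + l)
    return mapped * sdr_peak            # back to nits on the SDR display
```

The curve is nearly linear for dark content and compresses highlights so the HDR peak lands exactly at the SDR peak, which is the basic trade-off any tone mapper has to make.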
<br />
<br />
<table>
<tbody>
<tr><th>Player</th><th>CPU %</th><th>Memory Usage</th><th>GPU 3D</th><th>GPU Decode</th><th>GPU Processor</th><th>Smooth Playback</th><th>Colours</th><th>Realtime</th></tr>
<tr><td>VLC</td><td>2</td><td><span style="color: red;">1840</span></td><td>40</td><td>40</td><td>0</td><td>yes</td><td>ok</td><td>yes</td></tr>
<tr><td>MPV</td><td><b><span style="color: red;">100</span></b></td><td>835</td><td><span style="color: orange;">65</span></td><td>0</td><td>0</td><td><span style="color: orange;"><span style="color: #b45f06;">stutters</span></span></td><td><span style="color: red;">dark</span></td><td>yes</td></tr>
<tr><td>MPC-BE</td><td>2</td><td>1250</td><td><span style="color: orange;">60</span></td><td>30</td><td><span style="color: orange;"><span style="color: #e69138;">41</span></span></td><td><span style="color: red;">no</span></td><td><span style="color: red;">too bright</span></td><td><span style="color: red;">no</span></td></tr>
<tr><td>MPC-HC</td><td>1</td><td>1054</td><td><span style="color: orange;">60</span></td><td>60</td><td><span style="color: orange;"><span style="color: #e69138;">53</span></span></td><td>yes</td><td><span style="color: red;">washed out</span></td><td>yes</td></tr>
<tr><td>Movies & TV</td><td>1</td><td>650</td><td><span style="color: orange;">60</span></td><td>40</td><td>0</td><td>yes</td><td><span style="color: red;">saturated</span></td><td>yes</td></tr>
<tr><td>Kodi</td><td>2</td><td>1045</td><td>40</td><td>40</td><td><span style="color: orange;"><span style="color: #e69138;">55</span></span></td><td>yes</td><td><span style="color: red;">washed out</span></td><td>yes</td></tr>
<tr><td>MPV Ctrl+H</td><td>2</td><td>932</td><td><span style="color: orange;"><span style="color: #e69138;">80</span></span></td><td>45</td><td>0</td><td>yes</td><td><span style="color: red;">dark</span></td><td>yes</td></tr>
</tbody></table>
<br />
The first thing noticeable is that by default MPV doesn't use the GPU to decode this file. You have to manually tell it to do it. The last line adds the performance of MPV with hardware decoding (d3d11va) on.<br />
<br />
The second thing is that, apart from VLC, no player displays the HDR colours/luminance correctly (there is an SDR version of the same file for comparison, but I don't know how official it is; I also compared with what my HDR TV does). That's surprising for MPV, as the tone mapping in VLC is inspired by their code.<br />
<br />
The third thing is that MPC-BE cannot play this file in real time, even though the CPU and GPU are not maxed out. Maybe a buffering issue: the audio stops every few seconds and then playback resumes.<br />
<br />
The stuttering in MPV means the 60fps of the source is not respected. The frames are either skipped or not displayed at the right time (something we fixed in VLC after some hard work).<br />
<br />
<h2>
<a href="https://mega.nz/#!XI1yiKLA!rF4vweNo_xA7vpSzpLU-JctqfUGZN4vVU1m6WIJ5lT4" target="_blank">DNCE H264 1080i 29.97fps</a></h2>
(found on <a href="https://kodi.wiki/view/Samples">https://kodi.wiki/view/Samples</a>)<br />
<br />
This sample is simpler to decode, but deinterlacing still has to be done, either by the CPU or the GPU.<br />
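For readers unfamiliar with deinterlacing, here is a Python sketch of the simplest method, "bob" (each field becomes its own frame, with missing lines filled by line doubling). This is an assumption for illustration: real players use far better GPU filters (yadif, motion-adaptive deinterlacers, etc.), not this.

```python
def bob_deinterlace(frame, top_field_first=True):
    """Split an interlaced frame (a list of pixel rows) into two
    progressive frames by doubling each field's lines ("bob").
    Illustration only -- real players use better filters."""
    top, bottom = frame[0::2], frame[1::2]   # the two interleaved fields

    def line_double(field):
        # repeat every field line to restore full frame height
        return [row for r in field for row in (r, r)]

    first, second = (top, bottom) if top_field_first else (bottom, top)
    return line_double(first), line_double(second)
```

A 29.97 fps interlaced source thus yields 59.94 progressive frames per second, which is why a deinterlacer's cost shows up in the GPU numbers below.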
<br />
<table>
<tbody>
<tr><th>Player</th><th>CPU %</th><th>Memory Usage</th><th>GPU 3D</th><th>GPU Decode</th><th>GPU Processor</th><th>Smooth Playback</th><th>Deinterlaced</th></tr>
<tr><td>VLC</td><td>2</td><td><span style="color: red;">399</span></td><td>30</td><td>12</td><td>18</td><td>yes</td><td>yes</td></tr>
<tr><td>MPV</td><td>7</td><td>133</td><td>7</td><td>0</td><td>0</td><td>yes</td><td><span style="color: red;">no</span></td></tr>
<tr><td>MPC-BE</td><td>1</td><td>306</td><td><span style="color: orange;"><span style="color: #e69138;">60</span></span></td><td>12</td><td><span style="color: orange;"><span style="color: #e69138;">30</span></span></td><td><span style="color: #3d85c6;">very</span></td><td>yes</td></tr>
<tr><td>MPC-HC</td><td>1</td><td>340</td><td>40</td><td>12</td><td><span style="color: orange;"><span style="color: #e69138;">30</span></span></td><td>yes</td><td>yes</td></tr>
<tr><td>Movies & TV</td><td>2</td><td>162</td><td><span style="color: lime;">0</span></td><td>11</td><td><span style="color: orange;"><span style="color: #e69138;">27</span></span></td><td>yes</td><td>yes</td></tr>
<tr><td>Kodi</td><td>2</td><td>307</td><td><span style="color: orange;"><span style="color: #e69138;">73</span></span></td><td>9</td><td><span style="color: orange;"><span style="color: #e69138;">28</span></span></td><td>yes</td><td>yes</td></tr>
<tr><td>MPV Ctrl+H / D</td><td>1</td><td>160</td><td>14</td><td>11</td><td><span style="color: #e69138;">35</span></td><td>yes</td><td>yes</td></tr>
</tbody></table>
<br />
The last column shouldn't be needed, but by default MPV doesn't deinterlace the file; you have to press the d key to enable deinterlacing. The last line adds the performance of MPV with hardware decoding and deinterlacing on.<br />
<br />
MPC-BE seems to double the original frame rate by default and interpolate between frames (soap opera effect). That may be good for sports, but this is not a sports sample...<br />
<br />
Movies & TV is impressive, as it manages to display the content with 0% GPU 3D usage. It's likely because they do all the processing in the video processor and nothing during display. That's an area we could improve in VLC.<br />
<br />
<h2>
<a href="http://distribution.bbb3d.renderfarming.net/video/mp4/bbb_sunflower_1080p_30fps_normal.mp4" target="_blank">Big Buck Bunny H264 1080p 30fps</a></h2>
<br />
This is the most common kind of file people are playing (apart from 720p files).<br />
<br />
<table>
<tbody>
<tr><th>Player</th><th>CPU %</th><th>Memory Usage</th><th>GPU 3D</th><th>GPU Decode</th><th>GPU Processor</th></tr>
<tr><td>VLC</td><td>1</td><td><span style="color: red;">418</span></td><td>9</td><td>9</td><td>0</td></tr>
<tr><td>MPV</td><td>3</td><td>148</td><td>8</td><td>0</td><td>0</td></tr>
<tr><td>MPC-BE</td><td>0</td><td>280</td><td>30</td><td>10</td><td><span style="color: orange;"><span style="color: #e69138;">13</span></span></td></tr>
<tr><td>MPC-HC</td><td>1</td><td>265</td><td>30</td><td>10</td><td><span style="color: orange;"><span style="color: #e69138;">12</span></span></td></tr>
<tr><td>Movies & TV</td><td>1</td><td>95</td><td><span style="color: lime;">0</span></td><td>9</td><td>0</td></tr>
<tr><td>Kodi</td><td>2</td><td>235</td><td><span style="color: red;">70</span></td><td>6</td><td><span style="color: orange;"><span style="color: #e69138;">25</span></span></td></tr>
<tr><td>MPV Ctrl+H</td><td>0</td><td>104</td><td>8</td><td>10</td><td>0</td></tr>
</tbody></table>
<br />
As expected, the CPU usage is negligible. The DirectShow-based players seem to use a lot of GPU to display this simple file, and Kodi even more, even though it's using less GPU to decode. I'm not sure why they need some GPU processing here; maybe colour conversion, which VLC does in the shader. That would explain the extra GPU processor usage for the 1080i sample as well.<br />
<br />
<h2>
<a href="https://streams.videolan.org/issues/19196/sample.webm" target="_blank">Freedom '90 Music Video Outtakes VP9 1080p</a></h2>
(from <a href="https://www.youtube.com/watch?v=uh3aU63f8U8" target="_blank">YouTube</a>)<br />
<br />
If you watch a lot of YouTube there's a chance you might be decoding VP9 so I tested that as well. This is decoded by the GPU.<br />
<br />
<table>
<tbody>
<tr><th>Player</th><th>CPU %</th><th>Memory Usage</th><th>GPU 3D</th><th>GPU Decode</th><th>GPU Processor</th><th>Picture Quality</th></tr>
<tr><td>VLC</td><td>1</td><td><span style="color: red;">196</span></td><td>8</td><td>6</td><td>0</td><td>normal</td></tr>
<tr><td>MPV</td><td>1</td><td>100</td><td>6</td><td>0</td><td>0</td><td>normal</td></tr>
<tr><td>MPC-BE</td><td>0</td><td>215</td><td>40</td><td>6</td><td>10</td><td><span style="color: red;">macroblocks</span></td></tr>
<tr><td>MPC-HC</td><td>2</td><td>183</td><td>30</td><td>0</td><td>13</td><td><span style="color: red;">macroblocks</span></td></tr>
<tr><td>Movies & TV</td><td>0</td><td>76</td><td><span style="color: lime;">1</span></td><td>6</td><td><span style="color: orange;">7</span></td><td>normal</td></tr>
<tr><td>Kodi</td><td>3</td><td>280</td><td>70</td><td>5</td><td>25</td><td><span style="color: red;">macroblocks</span></td></tr>
<tr><td>MPV Ctrl+H</td><td>1</td><td>66</td><td>6</td><td>6</td><td>0</td><td>normal</td></tr>
</tbody></table>
<br />
In this case MPC-HC, MPC-BE and Kodi show noticeable macroblocks that the other players don't have.<br />
<br />
<h2>
<a href="https://4kmedia.org/lg-tech-uhd-4k-demo/" target="_blank">LG 4K Tech Demo HEVC 60 fps </a></h2>
<br />
A more regular 4K file that has no HDR, so there should be less to do in the GPU.<br />
<br />
<table>
<tbody>
<tr><th>Player</th><th>CPU %</th><th>Memory Usage</th><th>GPU 3D</th><th>GPU Decode</th><th>GPU Processor</th><th>Smooth Playback</th><th>Realtime</th></tr>
<tr><td>VLC</td><td>4</td><td><span style="color: red;">1215</span></td><td>18</td><td>65</td><td>0</td><td>yes</td><td>yes</td></tr>
<tr><td>MPV</td><td><b><span style="color: red;">34</span></b></td><td>615</td><td>30</td><td>0</td><td>0</td><td>yes</td><td>yes</td></tr>
<tr><td>MPC-BE</td><td>2</td><td>840</td><td><span style="color: orange;">70</span></td><td>40</td><td><span style="color: orange;">60</span></td><td><span style="color: red;">no</span></td><td><span style="color: red;">no</span></td></tr>
<tr><td>MPC-HC</td><td>2</td><td>765</td><td><span style="color: orange;">60</span></td><td>50</td><td><span style="color: orange;">65</span></td><td>yes</td><td>yes</td></tr>
<tr><td>Movies & TV</td><td>1</td><td><span style="color: lime;">293</span></td><td>15</td><td>70</td><td>0</td><td>yes</td><td>yes</td></tr>
<tr><td>Kodi</td><td>3</td><td>485</td><td><span style="color: orange;">60</span></td><td>45</td><td><span style="color: orange;">55</span></td><td>yes</td><td>yes</td></tr>
<tr><td>MPV Ctrl+H</td><td><b><span style="color: red;">34</span></b></td><td>693</td><td><span style="color: orange;">30</span></td><td>0</td><td>0</td><td>yes</td><td>yes</td></tr>
</tbody></table>
<br />
As with the HDR sample, MPC-BE can't play this file in realtime. The audio stops once in a while.<br />
<br />
Despite the request to enable hardware decoding, MPV doesn't seem to be using it.<br />
<br />
Movies & TV does an impressive job of using little memory.<br />
<br />
<h2>
Conclusion</h2>
<br />
Overall, VLC and Movies & TV seem to be the best players for all this content. The main drawback of VLC is currently its memory usage. It's possible to decrease it by using <i>--avcodec-threads=1</i>, but if you set this you may have problems playing files your GPU can't decode.<br />
<br />
We are working on this memory consumption, which should be reduced in all cases for VLC 4.0.<br />
<br />robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-85040523884500877022017-11-19T10:19:00.000+01:002017-11-19T13:51:45.944+01:00The Many Fathers of MatroskaI think I'm done giving talks about Matroska for this year. One of the things that bothers me each time (to the point I might <a href="https://www.youtube.com/watch?v=n1XYEVtxzU8&feature=youtu.be&t=6630" target="_blank">embarrass people unwillingly</a>, sorry <a href="https://twitter.com/kieranjol" target="_blank">Kieran O'Leary</a>) is that I take credit for all of Matroska, although many people were involved almost as much as me during its long birth. So I would like to set the record straight for posterity.
<br />
Also I say fathers because it was all men (or boys) involved. Only <a href="https://forum.doom9.org/member.php?u=21231" target="_blank">Liisachan</a> on Doom9 was involved in creating the original logo.
<br /><br />
<h3>Lasse Kärkkäinen (FI)</h3>
<br />
Lasse is the creator of MCF, the project that Matroska was forked from. Although forks are usually not a great idea, there were so many differences between his original format and what we turned it into (now Matroska) that we couldn't keep working on the same project. We agreed to disagree and went our separate ways. But there were no hard feelings; we met on a few occasions after that. He even asked me once for a letter of recommendation for a job in Finland.
<br /><br />
<h3>Frank Klemm (DE)</h3>
<br />
One of the key differences between MCF and Matroska is the use of EBML. And one of the key features of EBML is the way header values are coded in a UTF-8-like manner. This was Frank's idea. It gave the format a great boost and is the reason going back to MCF was not possible after that.
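That UTF-8-like scheme can be sketched in a few lines of Python: the position of the first set bit in the first byte tells the reader how many bytes the value occupies. This is a simplified illustration of size-style variable-length integers only, not libebml's real implementation (which also handles element IDs, unknown sizes and other details).

```python
def encode_vint(value):
    """Encode an unsigned integer as an EBML-style variable-size
    integer: the marker bit's position in the first byte gives the
    total length, UTF-8 style. Simplified illustration only."""
    for width in range(1, 9):
        # the all-ones pattern is reserved ("unknown size"), stay below it
        if value < (1 << (7 * width)) - 1:
            marker = 1 << (7 * width)
            return (marker | value).to_bytes(width, "big")
    raise ValueError("value too large for an 8-byte VINT")

def decode_vint(data):
    """Read the width from the first byte, strip the marker bit."""
    width = 9 - data[0].bit_length()   # leading zeros + marker position
    mask = (1 << (7 * width)) - 1
    return int.from_bytes(data[:width], "big") & mask
```

So small values cost a single byte (1 encodes as 0x81) while larger ones grow as needed (500 encodes as 0x41 0xF4), which is what keeps EBML headers compact.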
<br />
Frank was one of the developers of the MPC (Musepack) codec, which combined lossless and lossy audio compression in the same format. People were so happy with his work that there was a crowdfunding (a concept which didn't really exist at the time) on Doom9 to buy him a new PC.
<br /><br />
<h3>Christian HJ Wiesner (DE)</h3>
<br />
Christian is not a developer. He's not really a technical guy either. But he liked what we were doing so much that he organized everything around the project. He was also the first to join me when I created the fork on SourceForge. He's also the one who organized the crowdfunding for Frank Klemm and delivered him his PC. And he kept the <a href="http://matroska.org/">matroska.org</a> domain safe for a long time before donating it to the Matroska non-profit.
<br /><br />
<h3>John Cannon, Paul Bryson, Jory Stone (USA)</h3>
<br />
Apart from Frank and me, they provided the main input for the changes to MCF that ended up as Matroska. IIRC <a href="https://twitter.com/coquifrogs" target="_blank">John Cannon</a> was the one to suggest that the Matryoshka name we were planning to use was too complicated for Americans, and to shorten it to Matroska.
<br /><br />
<h3>Alexander Noe (DE)</h3>
<br />
Alexander was also developing an AVI muxer and a Matroska muxer at the same time we created libebml/libmatroska. He gave a lot of input on the format and some refinements which helped a lot. He later turned to artificial intelligence, so I guess he's a millionaire now.
<br /><br />
<h3>Moritz Bunkus</h3>
<br />
Everyone who has dealt with Matroska has used mkvtoolnix at some point. It's almost entirely the work of <a href="https://twitter.com/MoritzBunkus" target="_blank">Moritz</a>. He joined the project a bit later, when it was almost stable. At the time he was working on an OGM tool for Linux and got interested in doing the same for Matroska. It became mkvmerge. Since then he has been the main maintainer of the Matroska libraries and of the main Matroska tool. He's also part of the non-profit.
<br /><br />
<h3>Михаил "Haali" Мацнев (RU)</h3>
<br />
Mike created the famous Haali DirectShow demuxer based on his own C library. He also worked a lot on segment linking, even doing his own version that was easy to use with DirectShow (but not really clean standards-wise). Most people played Matroska files using his code for a long time.
<br /><br />
<h3>Ludovic Vialle / Dan Marlin (FR/US)</h3>
<br />
<a href="https://twitter.com/LudovicVialle" target="_blank">Ludovic</a> is the one who got me into this. I was looking for a container to replace AVI and MPEG PS and he pointed me in the MCF direction. He was working on his own DirectShow player at the time and later founded CoreCodec with Dan Marlin. CoreCodec helped a lot with Matroska development, hosting the website and mailing lists. At some point we also had our own web forum. They also worked a lot on cleaning up the specs that are currently on matroska.org. I also worked there for many years, and later with Ludovic's other company LevelUp Studios. Ludovic is also part of the Matroska non-profit.
<br />
<br />
For reference there's also a <a href="https://www.matroska.org/team.html" target="_blank">longer list of people involved</a> on our website. This list also contains a lot of people who helped develop the many softwares you might have used. It should be updated with all the people involved in CELLAR like <a href="https://twitter.com/dericed" target="_blank">Dave Rice</a>, <a href="https://twitter.com/ablwr" target="_blank">Ashley Blewer</a>, <a href="https://twitter.com/JeromeM78" target="_blank">Jerome Martinez</a>, <a href="https://twitter.com/RetoKromer" target="_blank">Reto Kromer</a>, Michael Bradshaw, Martin Below or Tobias Rapp.robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-86341575979171192902017-11-12T21:00:00.000+01:002017-11-12T21:00:52.986+01:00Still No Time To WaitThe second edition of <a href="https://mediaarea.net/MediaConch/notimetowait2.html" target="_blank">No Time To Wait</a> was a success. It's a conference
where archivists meet the developers of the software and formats they
use or might use.<br />
<br />
Since last year a lot has changed. We were advocating that people use Matroska and FFV1 because they meet their needs very well. This year we heard many stories of people who actually made the move and are happy about it. <a href="https://twitter.com/RetoKromer" target="_blank">Reto Kromer</a> even gave a presentation explaining that he does the conversion on the fly when transferring between tapes.<br />
<br />
One presentation particularly caught my attention: the search for the perfect player, by Agathe Jarczyk (University of the Arts, Bern). Working daily on improving VLC, that's certainly something we want to achieve: making every user happy, even in a professional context, not just for casual file playback. It turns out many of the issues mentioned, which prevent the switch from QuickTime Pro 7, are already solved. Here's the list:<br />
<ul>
<li>display metadata from the file: it's there with Ctrl+i (Cmd+i on macOS, I suppose) in the metadata tab. It's not at the MediaInfo level, but useful nonetheless. It's also refreshed during playback, so if the stream switches between formats midstream you can see it there. It won't tell you whether the data comes from the codec or the container, as it's aggregated by the player; if you really need that feature, file an issue in Trac</li>
<li>the list of codecs used for
playback. It's also available in a tab when you do Ctrl+m (Cmd+m) and can be
refreshed during playback (for example with streams that have mixed
interlacing). It's probably more an issue with QuickTime Pro where there
might be plug-ins in the system you're not aware of. It's much less
likely with VLC. It doesn't load modules compiled for an older version
and usually doesn't have extra modules coming from third parties. </li>
<li>added black borders when opening a video: this is surprising, as that's not the behavior on Windows or in the Qt interface in general. It may be a macOS-specific behavior or an option to use the "fit screen" aspect ratio. A reset of the preferences should fix that. </li>
<li>Can we display timecodes? It's technically possible; we decode them, but they are not frame accurate because of our internal clock design. To be accurate it needs a redesign that we are going to do for VLC 4.0. And that version will take less time to finish than it took to do 3.0.</li>
<li>To go back one frame at a time: it's possible to use a LUA script to do that, see: <a href="https://forum.videolan.org/viewtopic.php?p=462937#p462937" rel="nofollow" target="_blank">https://forum.videolan.org/viewtopic.php?p=462937#p462937</a></li>
<li><a href="https://twitter.com/emiliemagnin" target="_blank">Émilie Magnin</a> who hosted the Format Implementation panel also mentioned
the possibility of launching the player more than once at a time. It is
an option that's possible on Windows and Linux but apparently it takes a
little more work on macOS. You'll need an <a href="https://wiki.videolan.org/VLC_HowTo/Play_multiple_instances/" target="_blank">external AppleScript to do that</a>. </li>
</ul>
<br />
There were a lot of talks about open source in general as well. Everyone is pretty much sold on the idea now, and on how crucial it is for archivists to be able to rely on code that can be reused and tweaked for decades: a guarantee no proprietary software can offer. An interesting twist is that sometimes the software to play the content has to be archived as well, usually when using proprietary solutions that might (will) die over time. Another good reason not to use them in the first place. <br />
<br />
Some people are still not using <a href="https://twitter.com/MatroskaOrg" target="_blank">Matroska</a>. One of the reasons, which makes sense in their context, is that it's not (yet) a standard, that is, one endorsed by a standards body you trust. As pointed out by <a href="https://twitter.com/The_BFOOL" target="_blank">Ethan Gates</a>, that level of trust may vary and be totally arbitrary. For example, some still use AVI even though its specification has never gone through any of the common standards bodies (AFAIK). It is on us, and particularly on me, to make the standardization of Matroska happen and finish the work that is already under way. The main issue is that we all do this in our free time, so we may look for funding to get it done sooner rather than later. A crowdfunding was mentioned. We're going to discuss how we can make this happen (suggestions welcome). That would be a first for Matroska, as we never received money for the project (apart from around $200 of PayPal donations over 15 years).<br />
<br />
A big thanks to all the organizers and especially <a href="https://twitter.com/dericed" target="_blank">Dave Rice</a> and <a href="https://twitter.com/JeromeM78" target="_blank">Jerome Martinez</a> and to <a href="https://twitter.com/m_loebenstein" target="_blank">Michael Loebenstein</a> of the <a href="https://twitter.com/filmmuseumwien" target="_blank">Austrian Film Museum</a> for a great venue.<br />
My apologies to <a href="https://twitter.com/kieranjol" target="_blank">Kieran O'Leary</a>, I promised I'd bring the <a href="https://twitter.com/videolan" target="_blank">VLC hat</a> on the second day and then I forgot. robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-36937193332367733702017-11-05T19:41:00.000+01:002017-11-05T19:41:11.950+01:00Matroska versus fragmented MP4In an <a href="http://robux4.blogspot.fr/2017/10/foms-and-demuxed.html" target="_blank">earlier post</a> I was worried that Matroska might have lost its edge compared to MP4 when it comes to overhead size. So I dug a little deeper with some real life samples from none other than Apple to see what we could improve. It turns out that <b>Matroska is still the best</b> when it comes to overhead (and just about everything else).<br />
<br />
Here are some comparisons from the <a href="https://developer.apple.com/streaming/examples/" target="_blank">Apple adaptive streaming sample page</a>. I don't know how they compare to real-life files (maybe they are improperly muxed), but the results are always in favor of Matroska, even when large padding and tags are left in the file.<br />
<br />
<h2>
Advanced Stream</h2>
The lowest bitrate video is 530 kbps according to the manifest and 369 kbps according to <a href="https://mediaarea.net/en/MediaInfo" target="_blank">MediaInfo</a>. I remuxed it with <a href="https://mkvtoolnix.download/index.html" target="_blank">mkvmerge</a>, then ran it through <a href="https://www.matroska.org/downloads/mkclean.html" target="_blank">mkclean</a>. Here are the results:<br />
<ul>
<li>27 672 619 original fMP4 with H264</li>
<li>27 449 794 mkvmerge with default options (we win already)</li>
<li>27 447 068 mkclean with default options</li>
<li>27 439 197 mkclean with <i>--live</i></li>
<li>27 357 090 mkclean with <i>--remux --optimize</i></li>
<li>27 349 220 mkclean with <i>--remux --optimize --live</i></li>
</ul>
The normal usage when preparing a file for streaming would be mkclean with <i>--remux --optimize</i>, and that gives a 1.1% size advantage that could be better spent on the codec. That stream even includes checksums and tags, and is fully seekable.<br />
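The 1.1% figure comes straight from the sizes listed above; a trivial check (the helper name is my own):

```python
def saving_pct(reference: int, candidate: int) -> float:
    """Relative size saving of `candidate` versus `reference`, in percent."""
    return (reference - candidate) / reference * 100

fmp4 = 27_672_619         # original fMP4 with H264
mkclean_opt = 27_357_090  # mkclean --remux --optimize

print(f"{saving_pct(fmp4, mkclean_opt):.1f}%")  # → 1.1%
```

The same computation on the HEVC stream below (11 492 052 vs 11 371 266 bytes) also rounds to 1.1%.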
<br />
<h2>
Advanced Stream HEVC</h2>
Here Matroska doesn't have the advantage of using header compression as with H264, which saves 3 bytes per frame since they are always the same. The 145 kbps bitrate is also closer to the lower limit of everyday files.<br /><ul>
<li>11 492 052 original fMP4 with HEVC</li>
<li>11 410 786 mkvmerge with default options (we win already)</li>
<li>11 407 257 mkclean with default options</li>
<li>11 371 266 mkclean with <i>--remux --optimize</i></li>
</ul>
But we're still 1.1% smaller than the same content in fragmented MP4.<br />
<h2>
Advanced Stream H264</h2>
This is the same as above but in H264 format, so we get to use header compression.<br />
<ul>
<li>10 663 861 original fMP4 with H264</li>
<li>10 558 115 mkvmerge with default options (we win already)</li>
<li>10 554 002 mkclean with default options</li>
<li>10 498 886 mkclean with <i>--remux --optimize</i></li>
</ul>
<h2>
Conclusion</h2>
So Matroska is still the best when it comes to overhead, and it keeps all its other advantages. Only very small, finely tuned files might actually come out in favor of fMP4. I'd really like to have such real-life samples if you have some.<br />
robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com1tag:blogger.com,1999:blog-7329601.post-69974697965661739592017-10-25T08:48:00.001+02:002017-10-25T13:57:22.491+02:00FOMS and Demuxed<div dir="ltr" id="docs-internal-guid-e7edb72c-523a-db8b-2343-9ec8d2d70988" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">On October 3rd and 4th I attended the <a href="http://www.foms-workshop.org/foms2017/" target="_blank">FOMS workshop</a></span><span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;"> in San Francisco, then <a href="http://demuxed.com/" target="_blank">Demuxed</a> on the 5th. There were a lot of discussions about video, mostly distribution and playback via web browsers. It was interesting as it’s a different take from my daily work on <a href="http://www.videolan.org/" target="_blank">VLC</a>. Vendors developed very specific techniques targeted at their particular use case, often to get around bogus past decisions or competing solutions.</span></div>
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">As Matroska (used by <a href="https://www.webmproject.org/docs/container/" target="_blank">WebM</a>) was primarily designed for playback over network connections (that were slow at the time of design) it was interesting to see if we can cover all these use cases in an optimal way. It is especially important to remain relevant as the AV1 codec is coming soon. It seems to be getting huge traction already and might end up being the main codec everyone uses in the years to come, especially for web videos. Even though it’s targeted at high quality it seems people want to use it ASAP for very low bitrates. I suppose the quality gain for the same bitrate is even more significant there.</span></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjJF1Fp6x9M0OGJHaSYrPBCRBDCUSyzRu6rgoWaB3ItdXsxiz58NLblnLCRbxdMfv7eZRnBDn61FoL5nH6jc_gGueAdbwKaczGr2ahyPoOgwqZRO_RdfzAfa5cC2IqPUFPS1yE/s1600/FOMS2017.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="847" data-original-width="1600" height="210" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjJF1Fp6x9M0OGJHaSYrPBCRBDCUSyzRu6rgoWaB3ItdXsxiz58NLblnLCRbxdMfv7eZRnBDn61FoL5nH6jc_gGueAdbwKaczGr2ahyPoOgwqZRO_RdfzAfa5cC2IqPUFPS1yE/s400/FOMS2017.jpg" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">FOMS 2017</td></tr>
</tbody></table>
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">Two subjects particularly caught my attention in terms of challenges for the container.</span></div>
<h2 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 18pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 16pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">Extremely low latency </span></h2>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">It seems a lot of companies are looking at reducing the time between the moment something happens and the time it’s displayed on your screen. In the age of Twitter it sucks to see a goal or other (e)sport event happening on your feed before you actually get to see it. In games it also means the people streaming the game can <a href="https://www.youtube.com/watch?v=R3Ki-w84rP4&t=743" target="_blank">interact in real time with what people are seeing</a>.</span></div>
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">Due to the nature of video encoding you can hardly get lower than one frame of delay (17 ms at 60 fps) plus the transmission latency (10 ms if you have an incredible ping). But right now the target is more around a few seconds, or a single second. One of the issues here is how adaptive streaming is currently used. It encodes a bunch of frames and then tells the user they’re available (in various bitrates). That’s because the container needs to know all the frames it contains before it can actually be used. So they wrap about 1 s of video to keep latency to a minimum.</span></div>
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">Matroska and EBML have a mode called live streaming. It allows writing frames as they come in and never rewriting the beginning of the file to tell how much data it contains or where the data actually are. So you can start reading the file even while it’s being written. Many years ago GStreamer was used to stream conferences that way (without even an actual file being written) and that’s how VLC 3.0 sends videos to the Chromecast. This is also how most Matroska/WebM muxers work. They write in “live streaming” mode by default: they write a special “unknown” value in the length field and when the size is known this value is overwritten. So a streamer can create files on the fly that people could start reading. And when the file is done write the proper values so that the next people reading from that file actually get proper values they can use to seek.</span></div>
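The live streaming mode described above relies on EBML's variable-length size fields (VINTs): per the EBML specification (RFC 8794), a size whose data bits are all set means "unknown", and a muxer can write a fixed-width 8-octet size up front and overwrite it once the real size is known. A minimal sketch of that encoding, with a hypothetical `encode_vint` helper:

```python
def encode_vint(value: int, length: int) -> bytes:
    """Encode an EBML element size as a VINT of `length` octets (1..8)."""
    marker = 1 << (7 * length)  # length-descriptor bit
    # all data bits set means "size unknown", so a real size must be smaller
    assert 0 <= value < marker - 1
    return (marker | value).to_bytes(length, "big")

# 8-octet "unknown size" placeholder a live muxer writes up front
UNKNOWN_8 = ((1 << 56) | ((1 << 56) - 1)).to_bytes(8, "big")  # 01 FF FF FF FF FF FF FF
```

A live muxer writes `UNKNOWN_8` as the Segment size, appends Clusters as they are produced, and optionally seeks back to replace it with `encode_vint(real_size, 8)` when the file is finalized, which is exactly the two-phase behavior described above.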
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">I hope the web people take a look at this, as it would allow going way below the 1 s latency target they currently have. It would also work for adaptive streaming, as you still get Clusters that you can cut into many parts on a CDN, as currently done for WebM. This solution is already compatible with most Matroska/WebM readers. It’s been in our <a href="https://www.matroska.org/downloads/test_w1.html" target="_blank">basic test suite</a> for at least 7 years</span><span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">. </span></div>
<h2 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 18pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 16pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">CMAF</span></h2>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">I learned of the existence of a new MP4 variant called <a href="https://lists.aau.at/mailman/listinfo/mpeg-cmaf" target="_blank">CMAF</a> (Common Media Application Format</span><span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">). It’s an ISOBMFF profile based on Fragmented MP4 (fMP4), developed by Microsoft and Apple. The goal was to use the same format for DASH and HLS to reduce the cost of storage on CDNs and get better caching. In the end it might not be of much use, because the different vendors don’t support the same DRM systems and so at least 2 variants of the same content will still be needed.</span></div>
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">This is an interesting challenge for Matroska, as with AV1 coming there will be a battle over which container to use to distribute videos. The container is not the main adoption issue anymore though. For example Apple only supported HLS with MPEG-TS until iOS 10, so many JavaScript frameworks <a href="https://open.nytimes.com/improving-our-video-experience-part-one-our-on-demand-video-platform-cf818e03353d#d1f8" target="_blank">remux the incoming fMP4 to TS on the fly</a> and feed that to iOS.</span></div>
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">Regular MP4 files were never good for progressive download, nor for the fragmented playback needed for adaptive streaming: the index is required for playback, so it has to be loaded beforehand, and it is not necessarily at the front of the file. The overhead (the amount of data the container adds on top of the actual codec data) wasn’t great either. So far these were two key advantages for Matroska/WebM, as they were among the main criteria when the format was designed 15 years ago. There were cases where MP4 could be smaller, at the price of using compressed headers. The situation changes with fMP4 and CMAF: their overhead is in fact slightly lower than Matroska/WebM’s. And that’s pretty much the only advantage they have over Matroska.</span></div>
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">On a 25 MB file at 44 kbps (where overhead really hurts) the difference between the fMP4 file and one passed through <a href="https://www.matroska.org/downloads/mkclean.html" target="_blank">mkclean</a> is 77 KB, or 0.3%. It may seem like peanuts, especially at such a small bitrate, but I think Matroska should do better.</span></div>
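The 0.3% figure can be reproduced from the numbers in the paragraph (a back-of-the-envelope check; treating KB and MB as binary units is my assumption):

```python
file_size = 25 * 1024 * 1024  # ~25 MB sample
saved = 77 * 1024             # ~77 KB overhead difference
print(f"{saved / file_size:.1%}")  # → 0.3%
```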
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">Looking at the fMP4 file, it seems the frames are all packed in one blob, with the boundaries between frames stored in a separate blob (the ‘trun’ box). And that’s about it. It can only work with fixed frame rates and probably allows no frame drops. But that’s efficient for the use case of web video over CDNs, encoded and muxed for that special purpose. There’s hardly any overhead apart from the regular track header.</span></div>
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">One way Matroska could be improved for such a case would be to allow frame lacing for video. Lacing is already used heavily for audio to reduce the overhead, since audio doesn’t need a timestamp for each block; the sampling rate is enough (except when there are drops during recording, in which case lacing is not used). We could allow lacing video frames as long as the default duration for the track is set (similar to a frame rate) and each frame has the same characteristics in the Matroska Block, especially the keyframe flag. Keyframes would stand alone, and many other video frames could be laced to reduce the overhead, the same way it’s done for audio. With such a small bitrate it could make a significant difference. On higher bitrates not really, but there the overhead difference between fMP4 and Matroska is probably small, if not in favor of Matroska (thanks to header compression).</span></div>
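To get a feel for the potential gain, here is a rough model of container overhead with and without lacing: one block header per block, plus a size entry for every laced frame except the last of each block (which is how lacing stores sizes). The byte counts are assumptions for illustration, not exact Matroska figures:

```python
def block_overhead(frames: int, frames_per_block: int,
                   block_header: int = 10, lace_entry: int = 2) -> int:
    """Rough overhead: one block header per block, plus one lace-size
    entry for every laced frame except the last in each block."""
    blocks = -(-frames // frames_per_block)  # ceiling division
    laced = frames - blocks                  # frames carrying a size entry
    return blocks * block_header + laced * lace_entry

frames = 30 * 60 * 30             # 30 minutes at 30 fps
print(block_overhead(frames, 1))  # no lacing: one header per frame
print(block_overhead(frames, 8))  # 8 frames laced per block
```

With these assumed numbers, lacing 8 frames per block cuts the overhead by roughly 70%, which matters at 44 kbps but is noise at typical video bitrates, matching the conclusion above.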
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">I will submit the proposal to the CELLAR workgroup of the IETF (Internet Engineering Task Force), a group that is currently working on properly specifying EBML and Matroska, but also FFv1 and FLAC. This is not a big change; it’s just something that we didn’t allow before. And because lacing is already in use for audio in just about every Matroska/WebM file that exists, the parsing code already exists in current players and may work out of the box with video frame lacing. It doesn’t add any new element.</span></div>
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">The advantages of Matroska over MP4 remain the same for fMP4.</span></div>
<ul style="margin-bottom: 0pt; margin-top: 0pt;">
<li dir="ltr" style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; list-style-type: disc; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">It can handle a lot more codecs (VP8, VP9, Vorbis, Opus)</span></div>
</li>
<li dir="ltr" style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; list-style-type: disc; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">It can be produced on the fly (see above for extreme low latency)</span></div>
</li>
<li dir="ltr" style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; list-style-type: disc; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">It doesn’t require a special mode for streaming/progressive download and another for local storage or archiving, it’s always the same format</span></div>
</li>
<li dir="ltr" style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; list-style-type: disc; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">Over 15 years of existence there has never been <a href="https://www.loc.gov/preservation/digital/formats/fdd/fdd000342.shtml" target="_blank">any patent claim over anything we use</a>. This may not be the case for <a href="https://www.loc.gov/preservation/digital/formats/fdd/fdd000079.shtml" target="_blank">ISOBMFF where Apple and Matsushita hold patents</a> </span><span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">and <a href="https://concolato.wp.imt.fr/misc/mpeg-4-systems-patent-pool-analysis/" target="_blank">maybe others</a> </span><span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">and in <a href="http://blog.chiariglione.org/on-my-charles-f-jenkins-lifetime-achievement-award/" target="_blank">general from MPEG technologies</a> </span><span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">like <a href="https://www.youtube.com/watch?v=75cdIHLWanY" target="_blank">HEVC</a> </span><span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">and <a href="http://www.mpegla.com/main/programs/MPEG-DASH/Documents/DASHWeb.pdf" target="_blank">MPEG-DASH</a>.</span></div>
</li>
<li dir="ltr" style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; list-style-type: disc; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">It was created out in the open (IRC, mailing lists) and continues to be developed in the open on <a href="https://datatracker.ietf.org/wg/cellar/about/" target="_blank">the IETF mailing list</a></span><span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">.</span></div>
</li>
<li dir="ltr" style="background-color: transparent; color: black; font-family: Arial; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; list-style-type: disc; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">It’s totally free, even to get access to the specifications. </span><br />
<br /></div>
</li>
</ul>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjAvP58PvVbLQt0ZXel9E3AEeMtQawYhfqYSoocB00s25w3tDREYsu8A6ak0APUPnrI6pmg_UkQyjXYTefwG750ZYtpD1rxBnB3zhtc2oNJvcGDa71U69WpamuqYbzpFPL2gnE/s1600/Demuxed.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="1068" data-original-width="1600" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjAvP58PvVbLQt0ZXel9E3AEeMtQawYhfqYSoocB00s25w3tDREYsu8A6ak0APUPnrI6pmg_UkQyjXYTefwG750ZYtpD1rxBnB3zhtc2oNJvcGDa71U69WpamuqYbzpFPL2gnE/s400/Demuxed.jpg" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Demuxed 2017</td><td class="tr-caption" style="text-align: center;"><br /></td></tr>
</tbody></table>
<h2 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 18pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 16pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">TL;DR</span></h2>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: "arial"; font-size: 11pt; font-style: normal; font-variant: normal; font-weight: 400; text-decoration: none; vertical-align: baseline;">Matroska has a lot to offer to web distribution: one-frame latency <i>at scale</i>, which is not possible with ISOBMFF formats; no need for new designs for current and future use cases; and the most open and free solution.</span></div>
robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com2tag:blogger.com,1999:blog-7329601.post-91288498038263666942014-07-08T08:13:00.002+02:002014-07-08T08:26:26.409+02:00Android Versions by The MillionsLast month I blogged about the <a href="http://robux4.blogspot.fr/2014/06/android-version-distribution.html" target="_blank">missing market share progress graph</a> that Google used to publish. I also provided some extra graphs based on the collected data, with many interesting facts to extract from them. I updated these data/graphs with the <a href="http://www.xda-developers.com/android/latest-android-platform-stats-kitkat-nearly-18-overtakes-gingerbread-but-growth-slows/" target="_blank">latest Play Store stats</a>. But there was something significant missing, and it's what really matters for app developers: how many actual people are on each version, in millions.<br />
<br />
To get this information, we need to know how many active users there are. We never had this information until Google I/O 2014, where Sundar Pichai announced that there were <a href="http://www.theverge.com/2014/6/25/5841924/google-android-users-1-billion-stats" target="_blank">1 billion users active on the Play Store at that time</a>. The other regular piece of information I could find was the quarterly worldwide shipments of Android devices from 2010 up to April 2014. Using this data and a lifetime of 18 months for each shipped device, I managed to reconstruct the progression in millions of active devices and arrive at the 1 billion number we have now. The math may not be entirely sound, but in the end the growth is pretty linear from the beginning, and the milestones from each I/O keynote seem to coincide (number of activations vs. active devices). All these data are added to <a href="https://docs.google.com/spreadsheets/d/1xloWINIqJXKZlJDv_w9CLNDo9WDylvFUSQyjeAaUZFI/edit?usp=sharing" target="_blank">the original spreadsheet in the page "Active Users"</a>.<br />
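The reconstruction described above boils down to a rolling sum: a device shipped in quarter q is counted as active for 6 quarters (18 months) and then drops out. A sketch of that, with made-up shipment figures purely for illustration:

```python
def active_devices(shipments, lifetime_quarters=6):
    """Active installed base per quarter: each shipped batch stays in
    use for `lifetime_quarters` quarters, then drops out of the sum."""
    return [sum(shipments[max(0, i - lifetime_quarters + 1): i + 1])
            for i in range(len(shipments))]

# hypothetical quarterly shipments in millions, for illustration only
print(active_devices([10, 20, 40, 60, 80, 100, 120, 150]))
```

Feeding in the real quarterly shipment figures from the spreadsheet is what produces the progression toward the 1 billion mark.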
<br />
Given this roughly accurate data, I could build the graph of each version's progression in millions of users, not just in market share. <br />
<br />
<br />
<iframe frameborder="0" height="371" scrolling="no" seamless="" src="//docs.google.com/spreadsheets/d/1xloWINIqJXKZlJDv_w9CLNDo9WDylvFUSQyjeAaUZFI/gviz/chartiframe?oid=1819264364" width="646"></iframe>
<br />
The road to 1 billion has been pretty linear. The last quarter's global shipments are not known yet. And these numbers also take into account explosive growth in China, where the Play Store is not available.<br />
<br />
Another interesting graph, and the real information I was looking for, is how many users are currently on each API level.<br />
<br />
<iframe frameborder="0" height="371" scrolling="no" seamless="" src="//docs.google.com/spreadsheets/d/1xloWINIqJXKZlJDv_w9CLNDo9WDylvFUSQyjeAaUZFI/gviz/chartiframe?oid=366291995" width="646"></iframe>
To compare with the original one<br />
<br />
<iframe frameborder="0" height="371" scrolling="no" seamless="" src="//docs.google.com/spreadsheets/d/1xloWINIqJXKZlJDv_w9CLNDo9WDylvFUSQyjeAaUZFI/gviz/chartiframe?oid=1461561914" width="646"></iframe><br />
<br />
You can see the story is very different.<br />
<ul>
<li>KitKat was the fastest growing platform in recent Android history, and it shows even more in millions of users. If the growth continues like this it may reach 300 million users in the next 3 to 4 months, before Android L comes out.</li>
<li>Although API v16's market share has been slowly declining for a while, the platform was still growing, so its number of users was still growing; market share alone is not a good indicator. Its user count is in free fall now though, despite v16 still being the dominant API.</li>
<li>API v10 still has 140 million active users, and these are not Chinese users.</li>
<li>There were never more Ice Cream Sandwich users than Gingerbread users. It topped out at 200 million, compared to 300 million for Gingerbread.</li>
<li>The growth of API v17 is more significant when taking into account the number of users; it's still growing well.</li>
<li>On the other hand, API v18 is still not very significant in number of users.</li>
<li>There are still 7 million users using v8. </li>
</ul>
The good news is that supporting v17 and up gives you a very large number of users. But failing to support v10 or v15 gives 250 million more potential users to your competitors.<br />
<br />
We can assume that in the next 3 or 4 months v10 and v15 will each drop below 100 million users, and v19 should reach 250 to 300 million users and might overtake v17.robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-38620204998687894912014-06-14T17:32:00.000+02:002014-06-14T17:32:17.291+02:00Android Version Distribution<span style="font-family: Arial,Helvetica,sans-serif;">A long time ago, Google used to provide a graph of the evolution of version distribution with their monthly update of the <a href="https://developer.android.com/about/dashboards/index.html" target="_blank">Android Dashboards</a>. I have missed this graph ever since, because it gave a good indication of where we're at and what to expect in the coming months. This is especially important when planning a new project, to know what most users will have when your product ships.</span><br />
<br />
<span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;">After digging in the <a href="https://archive.org/web/" target="_blank">Internet Archive Wayback Machine</a>, I reconstructed all the data published by the Play Store from December 2009 up to now (June 2014). The result is this <a href="https://docs.google.com/spreadsheets/d/1xloWINIqJXKZlJDv_w9CLNDo9WDylvFUSQyjeAaUZFI/edit?pli=1#gid=0" target="_blank">bare spreadsheet table</a> with a link to the source I used for each line.</span></span><br />
<span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;"><br /></span></span>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiwQ9pT8WnMOPfLwo3gbHHgX5nlDVkyQWqKzyMOyKZWF31VGGw2hsLkuF82cXnqjJB__yyC9_PAGYdIB124tbtim-Y2LmNL9ZXOh7O6C5IcdzrEvZF9_-KjhRG_ULwUp2j8o_o/s1600/spreadsheet.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiwQ9pT8WnMOPfLwo3gbHHgX5nlDVkyQWqKzyMOyKZWF31VGGw2hsLkuF82cXnqjJB__yyC9_PAGYdIB124tbtim-Y2LmNL9ZXOh7O6C5IcdzrEvZF9_-KjhRG_ULwUp2j8o_o/s1600/spreadsheet.png" height="297" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><span style="font-family: Arial,Helvetica,sans-serif;">Reconstructed Play Store Statistics</span></td></tr>
</tbody></table>
<span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;">With the data in hand, it's now easy to create the graph that Google used to publish. But as soon as you see it (see in the <a href="https://docs.google.com/spreadsheets/d/1xloWINIqJXKZlJDv_w9CLNDo9WDylvFUSQyjeAaUZFI/edit?pli=1#gid=0" target="_blank">Google Doc spreadsheet</a>, after the numbers), you realize it makes more sense to group the Android versions by their codename (Froyo, Gingerbread, Honeycomb, Ice Cream Sandwich, Jellybean and Kit-Kat). That gives this chart:</span></span><br />
<span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;"><br /></span></span>
<span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;"><iframe frameborder="0" height="447" scrolling="no" seamless="" src="//docs.google.com/spreadsheets/d/1xloWINIqJXKZlJDv_w9CLNDo9WDylvFUSQyjeAaUZFI/gviz/chartiframe?oid=1478038078" width="729"></iframe></span></span><br />
<br />
<span style="font-family: Arial,Helvetica,sans-serif;">This graph looks familiar, just with more versions in it. Each major version seems to have the same life cycle</span><span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;"> and you can see that Jellybean is currently very dominant, which you also get from the official monthly pie chart. What you don't get is an idea of how things are likely to move in the next 3 to 6 months. For example, Gingerbread and lower currently represent 15% of the active Play Store users. When will that reach 10%? Judging by previous Android versions, it took 4 months to go from 15% to 10%. So that would be in October 2014.</span></span><br />
<br />
<br />
<span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;">This area view is nice, but there are other ways to represent the evolution by simply plotting the numbers in a line graph, as follows:</span></span><br />
<br />
<iframe frameborder="0" height="432" scrolling="no" seamless="" src="//docs.google.com/spreadsheets/d/1xloWINIqJXKZlJDv_w9CLNDo9WDylvFUSQyjeAaUZFI/gviz/chartiframe?oid=1966468336" width="732"></iframe>
<br />
<span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;">This graph, IMO, gives a much better idea of how things went for each version and of its real importance. Here are some noteworthy points:</span></span><br />
<ul>
<li><span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;">Each major version has a similar life cycle. A rapid growth and then a slow "logarithmic" decline.</span></span></li>
<li><span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;">The decline starts 4 to 5 months after the next major Android version is released (4 for Froyo, 5 for Gingerbread, 4 for ICS, 4 for Jellybean)</span></span></li>
<li><span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;">Honeycomb never had much of an impact</span></span></li>
<li><span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;">Ice Cream Sandwich was never more popular than Gingerbread</span></span></li>
<li><span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;">During Google I/O 2013, the most used Android version was Gingerbread (so much for <i>minSdkVersion=14</i>)</span></span></li>
<li><span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;">Around April 2013 Froyo and Gingerbread lost a lot of market share to the benefit of Jellybean (harder to see in the area graph)</span></span></li>
</ul>
<br />
<span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;">There's still another way to plot the data, from the time the version was introduced and counting how many months it was in use. That gives the following graph:</span></span><br />
<span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;"><br /></span></span>
<iframe frameborder="0" height="443" scrolling="no" seamless="" src="//docs.google.com/spreadsheets/d/1xloWINIqJXKZlJDv_w9CLNDo9WDylvFUSQyjeAaUZFI/gviz/chartiframe?oid=176045407" width="719"></iframe>
<span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;"><br /></span></span><span style="font-family: Arial,Helvetica,sans-serif;">There's plenty of extra information that can be found from this graph.</span><br />
<ul>
<li><span style="font-family: Arial,Helvetica,sans-serif;">Since Gingerbread, the evolution during the first months of each version is very similar</span></li>
<li><span style="font-family: Arial,Helvetica,sans-serif;">Older versions of Android were growing more rapidly to their peak</span></li>
<li><span style="font-family: Arial,Helvetica,sans-serif;">After a slow start Kit-Kat has caught up with the growth of Jellybean</span></li>
<li><span style="font-family: Arial,Helvetica,sans-serif;">We can project that Gingerbread will be below 10% in 4 months, and ICS will be below 10% in 2 months</span>
</li>
</ul>
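The projections above (Gingerbread below 10% in 4 months, ICS in 2) boil down to a simple linear extrapolation of the monthly share numbers. As an illustration only (not from the original post), here is a minimal Python sketch with made-up share figures; the real numbers live in the linked spreadsheet:

```python
import math

# Hypothetical monthly share data (percent) for a declining Android version,
# one entry per month, most recent last. Real figures are in the spreadsheet.
shares = [18.0, 16.5, 15.0, 13.8]

def months_until_below(shares, threshold):
    """Estimate how many months until the share drops below `threshold`,
    by extrapolating the average monthly decline over the observed window."""
    if shares[-1] < threshold:
        return 0  # already below the threshold
    per_month = (shares[0] - shares[-1]) / (len(shares) - 1)  # avg monthly drop
    if per_month <= 0:
        return None  # not declining; no projection possible
    return math.ceil((shares[-1] - threshold) / per_month)

print(months_until_below(shares, 10.0))  # → 3
```

With a longer observed window the average smooths out month-to-month noise, which matters since the Dashboard numbers themselves jump around a little.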
<br />
<span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;">With Google I/O 2014 on the way and Android 5.0 likely to be revealed there, we can get an idea of the fate of Kit-Kat: it will be similar to that of Ice Cream Sandwich. Of course, Jellybean cheats a bit here, since it spans 3 major versions. Here is a version of the graph, not grouped by main versions and starting at API v7.</span></span><br />
<span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;"><br /></span></span>
<iframe frameborder="0" height="371" scrolling="no" seamless="" src="//docs.google.com/spreadsheets/d/1xloWINIqJXKZlJDv_w9CLNDo9WDylvFUSQyjeAaUZFI/gviz/chartiframe?oid=1461561914" width="646"></iframe>
<span style="font-family: "Trebuchet MS",sans-serif;"><span style="font-family: Arial,Helvetica,sans-serif;"><br /></span></span><br />
<br />
<span style="font-family: Arial,Helvetica,sans-serif;">More points can be found from this graph:</span><br />
<ul>
<li><span style="font-family: Arial,Helvetica,sans-serif;">Obviously the older versions had fewer versions to share with, so each had more market share.</span></li>
<li><span style="font-family: Arial,Helvetica,sans-serif;">In recent years API v16 is the one with the most market share by itself.</span></li>
<li><span style="font-family: Arial,Helvetica,sans-serif;">Kit-Kat (v19) is growing faster than all the versions since Gingerbread (except for API v18)</span></li>
<li><span style="font-family: Arial,Helvetica,sans-serif;">API v15 has reached its peak 15 months ago</span></li>
<li><span style="font-family: Arial,Helvetica,sans-serif;">API v16 has reached its peak 6 months ago</span></li>
<li><span style="font-family: Arial,Helvetica,sans-serif;">API v17 is growing very slowly but still growing</span></li>
<li><span style="font-family: Arial,Helvetica,sans-serif;">All popular recent versions reached their peak around 15 months of existence </span></li>
<li><span style="font-family: Arial,Helvetica,sans-serif;">Versions like v14 or v9 almost never existed</span></li>
</ul>
<br />
<b><span style="font-family: Arial,Helvetica,sans-serif;">Conclusion / TL;DR: Know Android Users Through Fragmentation Graphs</span></b>robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-79833099476678479522013-03-17T16:45:00.001+01:002013-03-17T16:45:21.744+01:00ControlA while back I created a company in France on the side of my day job, hoping to make a little cash on the side. I created GAWST. My first idea was to make some programs for iOS and OS X, as people tend to pay more for software there. I had various projects lined up. The most advanced one was a replacement for the crappy Finder, something more in tune with Windows Explorer. Then Apple introduced the App Store and plenty of rules and restrictions. It was not possible anymore to sell a file explorer there. The App Store being the easiest and most obvious way to sell content there, I decided it wasn't worth the effort (working on OS X was never my cup of tea).
<br />
<br />
So I switched to Android. I wanted something like Ghostery for Android, a way to see what kind of traffic goes to sites I don't want, and a way to block it. So I started working on a proxy written from scratch in Java. Something lightweight and fast. It took me a while to get it right but now it's pretty stable. I can see and block all the traffic just like I wanted. And I can do a lot more with this little tool. But then last week Google removed all the ad blockers from the Play Store. I planned to sell a restriction unlocker via the Play Store (while basic functionality would be available to everyone). And once again all the time I spent on this project is wasted because a large company wants to have more control over what we can do with our machines.<br />
<br />
I don't know yet what I will do with the code of gawsttp (my proxy). I use it every day on all my devices and it's useful, even as a debugging tool (or to test Twitter's API blocking). But what is pretty clear is that I won't try again to make a commercial project of my own anytime soon. I'm tired of all the rules that are only good for the big players, despite real user demand.robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com2tag:blogger.com,1999:blog-7329601.post-39827292022772068472012-03-24T18:04:00.011+01:002012-03-25T10:41:29.896+02:00RésistanceWhen I read <a href="https://groups.google.com/forum/#%21msg/mozilla.dev.platform/-xTei5rYThU/DkM9AIbkNNIJ">this discussion on Mozilla Dev</a> about supporting H.264, I felt I had to say something. Not just as the creator of Matroska (on which WebM is based) but also as a general net citizen.<br /><br />Ever since I started working on video technology, I have realized how vast and encompassing the subject is. In this particular case it's about letting patents dominate a fundamental web technology. It's about who is controlling the technological stack when you go on the Internet (aka market shares). It's about a technical decision that is actually changing the whole principle of the web: <a href="http://www.w3.org/Consortium/Patent-Policy-20040205/">the technologies it uses have to be Royalty Free</a>, be it on the user side, on the server side or on the creation tools side.<br /><br />So far Mozilla was <a href="http://robert.ocallahan.org/2010/01/video-freedom-and-mozilla_23.html?m=1">opposed to using H.264</a>, not for technological reasons but on the principle that it didn't fit the Royalty Free web. 
They were also saying it was “<span style="font-style: italic;">not good for free softwares</span>”, that if used “<span style="font-style: italic;">you [couldn't] have a completely free software Web client stack</span>”, that it would be “<span style="font-style: italic;">honoring the letter while violating the spirit</span>” or that it “<span style="font-style: italic;">is not a game [they] are interested in playing</span>” and eventually “<span style="font-style: italic;">it pushes the software freedom issues from the browser (where [they] have leverage to possibly change the codec situation) to the platform (where there is no such leverage)</span>”. None of what was said then has changed. So what has changed must be Mozilla.<br /><br />When BootToGecko was announced, my first question was whether they would support H.264 when it's hardcoded in the phone chipset. Because answering this question has a lot of repercussions. The amount of reaction to their answer is proof that the issue is deep. In fact it goes further than just H.264, which is hardcoded and paid for in the hardware (I suppose a lawyer should double check on that to be sure). It also means supporting the MP4 file format, <a href="http://wiki.multimedia.cx/index.php?title=MP4_File_Format_Patents">which is also patented</a> but not hardcoded in chipsets, unless they plan to only support H.264 inside Matroska (they already have the WebM code to do it easily), but I doubt that. At some point they will likely support MPEG DASH, which made the W3C uneasy because it's not yet a Royalty Free technology. And there may be other patented technologies in the future.<br /><br />The W3C will never endorse the use of the H.264 codec; it doesn't fit its rules. Vendors using it for the <video> tag are just using a backdoor to put patented technologies on the web, where they should never have been in the first place. 
Now the message is sent loud and clear: there is no reason big companies shouldn't try to impose their technological dominance on the web (Skype and Facebook come to mind). Everyone will surrender in the end. The bigger you are, the bigger momentum you have, the better your chances of success. And patents are just as virus-like as the GPL license: once you touch them you can't escape from them...<br /><br />Maybe Mozilla is just acknowledging the fact that the web is going away from its core roots and what made it such a global success. But it still seems shortsighted to me. The timing of their announcement comes just weeks/months before new devices with VP8 hardware decoding are widely released. If they wanted to emphasize their continued support/favor for WebM they could at least have first released B2G on a device that has VP8 capabilities. Instead the message sent to those wondering if they should support WebM is: don't bother anymore, in the end we'll have H.264 everywhere. Such a decision to kill WebM should have been taken collectively. Especially since it's just to satisfy the needs of a project that currently has 0 users and will have to compete against Apple, Google and Microsoft on the smartphone/tablet market. Billion-dollar companies that heavily depend on this market for their future. Whereas Mozilla's future depends on product placement from the same companies. But that may change if I read <a href="https://hacks.mozilla.org/2012/03/video-mobile-and-the-open-web/">this</a> correctly: “We will not require anyone to pay for Firefox. We will not burden our downstream source redistributors with royalty fees.” Payment not required, but possible?<br /><br />To be fair, Google deserves a big share of the blame as well. Even though they provided VP8 to WebM, they didn't put all their weight in the battle. They have done too little, too late. And there is no sign of that changing. 
They even failed to deliver on their <a href="http://blog.chromium.org/2011/01/html-video-codec-support-in-chrome.html">promise of abandoning H.264 in Chrome</a> in favor of the technology they bought, which is perfect for the web. If there ever was a strategy for WebM I have never heard of it.<br /><br />Good reads on the subject:<br /><ul><li>A roundup of what's at stake here: <a href="http://arstechnica.com/gadgets/news/2012/03/idealism-vs-pragmatism-mozilla-debates-supporting-h264-video-playback.ars">http://arstechnica.com/gadgets/news/2012/03/idealism-vs-pragmatism-mozilla-debates-supporting-h264-video-playback.ars</a></li><br /><li>A point of view I share completely: <a href="http://www.osnews.com/story/25715/Mozilla_forced_to_consider_supporting_H_264">http://www.osnews.com/story/25715/Mozilla_forced_to_consider_supporting_H_264</a><br /></li><br /><li>A wider shift on the web is already happening: <a href="http://gigaom.com/2012/03/23/open-vs-closed-what-kind-of-internet-do-we-want/">http://gigaom.com/2012/03/23/open-vs-closed-what-kind-of-internet-do-we-want/</a></li><br /><li>Sir Tim Berners-Lee on the open web: <a href="http://www.scientificamerican.com/article.cfm?id=long-live-the-web">http://www.scientificamerican.com/article.cfm?id=long-live-the-web</a></li></ul>robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com8tag:blogger.com,1999:blog-7329601.post-82251573265722528352011-01-11T23:27:00.033+01:002011-01-13T19:21:52.058+01:00The Open Web Conspiracy<div>So Google has announced that it will drop support for H.264 in future iterations of its Chrome web browser. <a href="http://www.webmproject.org/">WebM</a> will be the favored format for video rendering inside the web browser. Until this announcement very few people really knew about WebM. It was considered a pet project of Google, like so many others. 
But with all the press of the last few days, it's impossible to ignore WebM anymore.</div><br /><div>What people didn't realize is that Google paid a good amount of money for this technology. Not to make even more money out of it, but simply because video inside the web browser was going nowhere. It was either the expensive H.264 or the "low" quality Theora for those who couldn't afford it. So Google bought On2 and VP8 with it and made it free and completely open. The exact kind of technology that is <a href="http://www.w3.org/Consortium/Patent-Policy-20040205/">mandatory in W3C standards</a>. They also converted most of their YouTube library to WebM (in 360p and 720p IIRC), which probably also cost a lot in storage.</div><br /><div>The tone of most reactions to this move was "how dare you Google?". Which is pretty astonishing when you think about it. People blamed Google for abandoning H.264 in a web browser. Completely ignoring the fact that Mozilla and Opera have always held that position. AFAIK the web has not collapsed because of this decision. In fact Firefox is now the dominant browser in Europe and Chrome is the fastest-growing one. I don't see how this move is going to hurt anybody already serving video on the web.</div><br /><div>Not only that but Mozilla and Opera are the ones to thank for the existence of HTML5, the ones that stood against MSIE and its proprietary way of interpreting the CSS standards. Once again there's going to be a war on standards. This time Google is joining the party. And once again the standard should win in the end. It's not a matter of who is the biggest, but of what an open web means and doesn't mean.</div><br /><div>Everyone who has been involved in the Audio/Video business knows how much patents are a nightmare to deal with. A real way of stopping innovation. It also costs a lot of money for small players, to the point where real business is almost impossible if you aren't large enough. 
At CoreCodec a large share of our revenues went each year to the MPEG LA for the use of MP3, H264 and AAC. On many products we didn't make any money other than what we had to pay for patents. If there are better business alternatives, everyone should embrace them.</div><br /><div>In the flow of articles/comments I read, a common recurring argument was that H.264 is already there. Yes, but almost no one is using the <video> tag right now. It's only at the experimental stage. They confuse the web browser and the entire video ecosystem. Everyone who's interested in the story of that <video> tag knows that, although a great idea, it was problematic due to browser fragmentation and the lack of agreement on the codec, H.264 being out of the question for many key players and likely for the W3C, which approves the standards. Now there's a solution in sight. Again, a solution for the <b>web browser</b>, not for the whole world of video. It's not a first-come/first-served system. If H.264 doesn't fit the bill there is no point in continuing (or ever starting) to support it. Because in the end something else will have to be used. This fact has always been known by all users of the <video> tag from the beginning.</div><div><br /></div><div></div><div>Also, the fact that something is already there doesn't mean future development should stop. In fact everyone working with the <video> tag knows that it's an evolving technology that is bound to change until it stabilizes. There are a lot of things coming like adaptive streaming (IMO the most important part of streaming, the one that will make browsers useful players), transparency (for better integration of video in rich UI), 3D (3D WebM files are already playable on YouTube) and even likely some form of DRM if the browser is going to become the universal "cloud" video player. All of these changes will require a lot of investment from people providing online video. 
The sooner they know what the possible choices are and which will be the best investment, the better.</div><div><br /></div><div>Video on the web is still in its infancy, so don't assume that what exists now is set in stone and should never change. There is no reason to invest time and money in something that will not last in the end. What is clear now is that WebM is not a pet project; it's here to stay.</div><div><br /></div><div></div><div>I've been involved in the development of WebM even before it was out (no surprise that I'm a supporter then). And one thing that struck me so far was that the most active in its development were Mozilla (3D and live streaming) and Opera (live streaming). Chrome has always been trailing with new features of WebM or even bug fixes. Now Google is finally putting its money where its mouth is. It's being pragmatic AND ideological, not one OR the other. The support of Flash being the pragmatic part here. Not only that but <a href="http://www.webmproject.org/about/supporters/">Adobe has been a supporter of WebM since day one</a> (and even before). I wouldn't be surprised if Flash supported WebM in the future. That would make sense for them too. If that ever happens, WebM will be the format that plays on all platforms, unlike H.264. 
And would likely be decisive in making WebM the first choice when putting video on the web.</div><br /><div></div><div>In the end one can always think of it as a big conspiracy from Google, Mozilla and Opera to free the web from audio/video patents, and keep the World Wide Web utopia alive and kicking.</div>robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-91649348718128458802011-01-08T13:33:00.005+01:002011-01-08T21:18:07.206+01:002011: The Year Apple Went Back To Its NicheHello I'm Steve Lhomme. You may know me from the <a href="http://www.matroska.org/">Matroska</a> (.mkv and .webm) format I (mostly) created, or from my years working on CorePlayer from <a href="http://www.corecodec.com/">CoreCodec</a> on platforms like Windows Mobile (5 & 6), Symbian, PalmOS, iOS, Android, etc., or lately from my work on the Plume Twitter client on Android for <a href="http://levelupstudio.com/">LevelUp Studio</a>.<div><br /></div><div>For many years I've followed and been involved in the rise of mobile computing and smartphones in particular. I've used and developed for just about all of these OSes (the ones that would allow some form of native code). In that time I've seen the growth of the iPhone (from the early version without an SDK, on which we already had CorePlayer running) up to the point it became almost a world of its own. And with great power comes great responsibility. Except that Apple has been acting very aggressively, for their own profit.</div><div><br /></div><div>Recently I've started noticing an unfair bias towards Apple from the press and from friends. Either because people give Apple credit for more than it deserves, or because they grant the Apple platform advantages that IMO are not good (more on that later). As long as Apple was essentially the only player with a modern platform, that was fine. 
But now that Android is on a level with iOS and exploding in market share, it's time to rethink old habits.</div><div><br /></div><div>The current situation is that on the OS level Android more or less matches what iOS offers. The OS was not meant for tablets and thus the ones sold so far leave something to be desired. It seems that Honeycomb (apparently Android 2.4) will leap forward and make Android more than decent on tablets. I personally think the tablet market is overrated, but that's another debate. On the hardware level there is certainly a lot more experimentation & innovation on the Android side. We now have dual-core, tiny devices, up to 4.3" screens, 4G/LTE, <a href="http://www.engadget.com/2011/01/07/nvidia-and-fujitsu-tens-android-car-nav-hands-on-video/">car dashboards</a>, 5" and 7" tablets, tablets as TV companions from <a href="http://gigaom.com/video/android-tablets-as-tv-remote-controls/">Vizio</a> or <a href="http://www.engadget.com/2011/01/05/panasonics-android-based-viera-tablet-unvieled-at-ces-2011/">Panasonic</a>, etc. And depending on your needs, there's usually one device that is exactly what you need. Just like not everybody wants to be dressed the same, not everybody needs the same from their phone/tablet.</div><div><br /></div><div>But with all the CES announcements, I've seen plenty of news/comments on Apple fanboy sites claiming that, no matter what, the iPhone/iPad are better. Even before Honeycomb was demonstrated and devices were tested by real people. For many, Apple has become a religion to follow, denying all "gods" other than theirs. This is not new (Mac vs PC). But it's always surprising when it comes from bright minds. And it's even more dangerous when a whole economy has been built on feeding solely the iOS ecosystem (and 30% of it in Apple's pockets).</div><div><br /></div><div>Things have changed radically in the last 6/12 months. Android is now dominant globally and even in the symbolic USA market. 
Unlike iOS it's completely free and open. That's why hardware and software innovations are now happening there. And the trend is not going to change. The head start that Apple had is now gone.</div><div><br /></div><div>The freedom in Android means anyone with a Windows, Mac, Linux computer can develop their software for free and run it on their devices without paying anything to Google. That is not possible with iOS. One can easily see which one is going to be picked by coders in developing markets. Not only that, but Android doesn't require a PC for synchronization or system updates. It doesn't require you to put your billing information in iTunes before buying apps. In a world where more and more people use a (smart)phone and not a PC, that's an important growth factor. One may argue that poor people don't buy apps. But they are likely to buy food, detergents, gas, etc. So the advertising model can work in these areas. So I think smartphones are going to be a lot bigger there too. It's a lot better if it doesn't require a PC.</div><div><br /></div><div>Because of all this, I think Android is going to explode even more this year. And unless Apple has something special about to be released (rumors don't seem to show that), it's going to lose even more significant market share. And like the Mac vs PC war, it's going to end up in a niche for trendy/hip/rich people. Which is probably fine for Apple as long as it has a good share of the #2 position. History is just repeating itself.</div><div><br /></div><div>NOT sent from my iPhone (yes, I still use one)</div>robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com6tag:blogger.com,1999:blog-7329601.post-1171188118355916132007-02-11T10:47:00.000+01:002007-02-11T11:01:58.366+01:00Upgrade your music/moviesSo a lot of people are talking about Steve Jobs' comments on the DRM issue. I haven't read them yet. 
But it made me think about the digital economy for content/entertainment.<br /><br />People have been buying music at 128 kbps (MP3 or AAC or WMA) or DVDs in 480i resolution, and with DRM. If they want to get better quality music/video or without DRM, they would probably have to <span style="font-weight:bold;">buy it again</span>. In software you usually get free upgrades or pay less than the full product for an upgrade (even Microsoft does it). So it should be the same for music. If you want to buy the same music in lossless or transparent quality, you shouldn't have to buy the whole thing again. That was necessary when you had to buy a physical object to upgrade your quality (tape/LP to CD, VHS to DVD). But from digital to digital, there's no need. The only problem is that to do that you need some DRM :(<br /><br />So the non-use of DRM might actually mean there won't be cheap upgrades... How ironic!robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com2tag:blogger.com,1999:blog-7329601.post-1169460860243796982007-01-22T11:10:00.000+01:002007-01-22T11:14:20.243+01:00The Universe Is Too Big For AliensAccording to this study, we haven't really met the aliens (officially speaking) because the universe is too big and it would take them too long to probe it and find us at the time we are developed.<br /><br />There's probably some truth in this, but it considers that travelling at the speed of light is not possible or even close. But it's already possible in labs. It doesn't take into account teleportation, which already exists in labs too, the use of dark energy, maybe telepathy and all that stuff we don't understand yet. So I think the answer is not there yet... Maybe we are the first ones to be that advanced in the universe? 
Or that matter is not travelling, only souls?robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-1169459982082531622007-01-22T10:59:00.000+01:002007-01-22T11:01:16.996+01:00New Concepts CatalogThis article lists all the modern concepts that will shape the future of science and intelligence.<br /><br />I haven't had time to read them all and the related articles, but a lot of them seem interesting.robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-1169459557466096032007-01-22T10:47:00.000+01:002007-01-22T10:52:37.616+01:00DivX is byeIt's now official. I'm going to leave DivX, Inc. on the 31st of January. I will be concentrating on the CoreCodec products like <a href="http://www.coreplayer.com/">CorePlayer</a> and also on Matroska-related stuff.<br /><br />For my whole time at DivX I worked on DrDivX, which is open source and available on SourceForge. I hope the project will still continue and evolve even after I leave. Maybe I will still contribute here and there. Especially since it relies on a CoreCodec product: CoreMake, used to create the project files on Windows (MSVC++ Express) and Mac (XCode).<br /><br />I decided to leave because I've been working on 2 jobs at the same time for many months and it's exhausting. Now that CoreCodec has some financial stability I can safely switch to this preferred job and work on the technologies that we create.robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-1168422410649557052007-01-10T10:42:00.000+01:002007-01-10T10:46:50.660+01:00The revolution according to AppleSo yesterday Apple unveiled the iPhone. There's a lot of buzz about it, but once the buzz dies down, what will remain? Engadget lists a few of the missing features. 
But Apple presented it as if nothing close to it had existed before...<br /><br />So what's new about this (smart)phone? The big touchscreen, including a technology they call multi-touch. So maybe the revolution, for Apple, is being able to use two buttons (fingers) when Apple computers only handle one (unless you buy a third-party mouse).<br /><br />That's it. Everything else already exists and is available elsewhere. The iPhone will be available in June in the USA and in Q4 in France. Far from revolutionary, and a bit late. If Nokia releases an internet tablet that works as a 3G phone, Apple is done for...robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-1167755546401021732007-01-02T17:28:00.000+01:002007-01-02T17:32:26.416+01:00Free WillWhat is free will? IMO that's (so far) the main difference between machines and living beings. And it's more than a technical debate, as it implies that we are just machines and that sophisticated machines could have free will too (as an innate property?).<br /><br />This article covers what free will might be after all... It's the first time I've seen the question raised, even though I've long been convinced it's a crucial point for the future. 
If machines can have free will too, then they can make decisions on their own (and without us).<br /><br /><blockquote>A bevy of experiments in recent years suggest that the conscious mind is like a monkey riding a tiger of subconscious decisions and actions in progress, frantically making up stories about being in control.</blockquote><br /><br /><blockquote>“If people freak at evolution, etc.,” he wrote in an e-mail message, “how much more will they freak if scientists and philosophers tell them they are nothing more than sophisticated meat machines, and is that conclusion now clearly warranted or is it premature?”</blockquote>robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-1152204287578716652006-07-06T18:41:00.000+02:002006-07-06T18:44:47.593+02:00EU Governments aware of the machine riseThis article is mostly about what the BT (British Telecom) manager said about the importance of women in the future. But there's this interesting quote:<br /><br /><blockquote>"The government is aware of this trend," he insists. "The EU is looking into it, not just in terms of machine intelligence, but as a problem of globalisation and machine intelligence leading to a surplus of men. It doesn't want large numbers of unemployed men standing around on street corners. We will see lots of demonstrations"</blockquote><br /><br />It is surely the biggest challenge of the coming decade: a whole new, different society to build, based on new values (or just good old money...).robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-1151859048976791932006-07-02T18:46:00.000+02:002006-07-02T18:50:48.990+02:00Big Brother is the future (and vice versa)A nice article about Vernor Vinge's views on the future, especially about security and privacy. 
It's clear that people will want more privacy, but technology will also make it ever easier to monitor whatever people do. It's hard to know where the line will be drawn, and among politicians nobody has really started working on it...<br /><br />An interesting quote, too, about real economic power:<br /><blockquote>The leaders of most powerful countries are coming to realise that the most important natural resources are not factories or the size of armies. Economic power is in the size of the population that is well-educated, creative and generally happy enough to be optimistic enough to want to do something creative.</blockquote>robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-1146299176614191862006-04-29T09:38:00.000+02:002006-04-29T10:26:17.280+02:00France is the enemyI usually like NYT articles: they tend to be well informed and give a good overview of a subject. But this one got me wondering...<br /><br />Apart from the doubtful humour about a foreign country and its citizens ("What has possessed the French?"), the article is just a plea for Apple's cause, no less. Apparently the author considers Apple the only online music retail store worth considering, and one that needs to be protected as such. If that's not a defense of a de facto monopoly, I don't know what is.<br /><br />The author completely ignores what worries most people about the iTMS and DRM in general: locking content into a proprietary solution. When you buy on the iTMS you can only use the content on your computer or your iPod. The iPod doesn't support any other form of DRM, and the Apple DRM is not available for licensing. So if you have $200 worth of content from Apple and for some reason your computer crashes, you lose all of it. And if you want to listen to your music on another device, you can't. That's precisely what the French law proposes to get rid of. 
Make sure that users have control over the content they bought, not the companies that make the hardware and software.<br /><br /><blockquote>penalize companies that harm consumers, not the ones that succeed by building better products.</blockquote><br /><br />Better products don't necessarily mean they don't harm consumers. In the long run, locking yourself into a technology with no technical means to get out of it (legitimately, as the law puts it) is a hidden suicide... I'm still waiting for a class action against Apple by users pissed that they can't use their content anywhere outside the Apple world. Then we'll see who is harming consumers (with a monopoly enforced by digital locks).<br /><br /><blockquote>Apple largely created the online market for legal music.</blockquote><br /><br />I think there were many other DRM solutions before Apple presented theirs.<br /><br /><blockquote>Second, iTunes has lots of music. Largely because of the innovative iTunes FairPlay copy protection and digital rights management software, Apple persuaded major record labels to let them sell much of their best content online. The combination of simplicity and variety proved a huge winner.</blockquote><br /><br />There have been reports about recent attempts by record labels to change the pricing policy of the iTMS. The idea was to make new content more expensive and older content cheaper. That sounds totally fair. But Steve Jobs and his marketing team decided that a single price was better. It's funny that the technology provider (because of its market position) thinks it can dictate how content owners should sell their goods... The result is that there won't be a deal renewal, but the content will remain on the iTMS. The difference is that content owners could pull their catalogs at any time, turning the iTMS into a useless technology and leaving iPod owners in despair. The huge success of the iTMS would turn into a catastrophe... 
So, in short, I don't think the picture of labels being happy with an all-in-one solution (where they have no voice) is realistic.<br /><br /><blockquote>If the French gave away the codes, Apple would lose much of its rationale for improving iTunes. Right now, after the royalty payment to the label (around 65 cents) and the processing fee to the credit card company (as high as 23 cents), not to mention other costs, Apple's margin on 99-cent music is thin. Yet it continues to add free features to iTunes because it helps sell iPods.</blockquote><br /><br />At least it's clear to everyone what FairPlay is about: selling iPods. That's the very reason Apple does its best not to open it to any partner (just look at the flop of the iTunes phones).<br /><br />Now, would opening (which doesn't mean breaking) FairPlay boost the iTMS? Yes. Would it boost music sales? Yes. Would it boost iPod sales? Likely, if you could convert WMA DRM to FairPlay. So what is wrong with opening it?<br /><br /><blockquote>Apple argues that sharing the codes could make the pirates' job easy enough to wipe out the legal market.</blockquote><br /><br />That assumes licensing a technology amounts to publishing it. AFAIK Microsoft's Janus (DRM) is licensed to many companies and there's no tool online to crack it. This argument is just FUD created by Apple and spread by "journalists".<br /><br /><blockquote>Agitators might claim that this is the very goal of the French bill: why else would it also reduce the maximum fine for consumers caught illegally downloading music from 300,000 euros (about $371,000) to just 38 euros (less than $47)?</blockquote><br /><br />It's funny how the journalist considers a 300,000€ fine for downloading a track online to be OK. The 38€ fine is per download. Only a fair law can be respected.<br /><br /><blockquote>Usually, rich countries don't meddle with others' intellectual property because they fear retaliation. 
So why don't the French fear retaliation now?</blockquote><br /><br />Retaliation against France. Now we have big words to defend Apple against France (one of its biggest markets in Europe)... In short the author is saying: don't mess with an American company or we'll retaliate in other markets. I guess he/she must own a lot of Apple stock to be so bold about an industrial (and actually a cultural) choice.<br /><br /><blockquote>One reason may be that they have concluded France will never really compete. If the Internet will always have an American accent, why not go after it? Sometimes, the red flag of revolution is surprisingly hard to distinguish from the white flag of surrender.</blockquote><br /><br />There were two options in the law: a global license, which would be a tax on downloads, or DRM interoperability. Both are considered good routes to make the digital economy fair and flourishing. So I see possibilities for innovation where the author sees surrender to a monopoly.<br /><br /><blockquote>Before declaring pre-emptive war on iTunes, however, perhaps the French would do best to remember a lesson from 1789. Sometimes the very people calling for revolution are the ones who end up losing their heads.</blockquote><br /><br />So far I haven't heard a single voice in France opposing the proposed law in order to defend the iPod. All the voices are crying that there will be less room for private copying. So the author is just fantasizing; maybe he thinks Bush will attack the French (once more) to defend Apple. That will make the French even more proud...robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-1146162829334155982006-04-27T20:29:00.000+02:002006-04-28T08:18:37.153+02:00Lilly Is HereLike any other father, I want to tell the world that my baby Lilly was born yesterday (2006-04-26) at 21:35 in Hawaii. She weighs 3.9 kg (8.6 pounds) and measures 53 cm (21 inches). 
She has dark hair and blue eyes.<br /><br />It's a wonderful and happy moment as much as a sad one: unfortunately I couldn't <a href="http://robux4.blogspot.com/2006/03/america-no-more.html">be there for legal reasons</a>. But Cecilia and the baby are doing fine, so that's enough to be happy. We'll be together in Europe around June.<br /><br />Welcome to the world, Lilly!robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-1145687834956981832006-04-22T08:10:00.000+02:002006-04-22T08:37:14.996+02:001366x768 & 750GoIn the modern world, false advertising can be punished. Yet tech companies have found legal ways to cheat on their product specifications.<br /><br />My LCD TV claimed to do 1366x768 pixels, as many (most) of them do. But when I plugged it into my computer, only 1280x720 (720p) was available. I thought there might be something wrong in my setup, even though I didn't notice the resizing blur typical of LCDs running outside their native resolution. I only found out the trick through an <a href="http://www.hdbeat.com/2006/04/21/whats-the-deal-with-1366-x-768/">HDBeat article</a>. The TV only has 1280x720 real pixels. Manufacturers pretend it has more because of the overscan found on analog TVs (analog TVs cut off the borders of the image). So there will never be 1366x768 physical or displayed pixels; that number is simply imaginary.<br /><br />Seagate also announced a 750 GB hard disk drive a few days ago. But don't expect your OS to report 750 GB when you plug it in. HDD manufacturers decided a long time ago that 1 GB = 1,000,000,000 bytes and not 1,024x1,024x1,024 bytes. It didn't make much of a difference for small disks, but on a 750 GB drive the gap is about 51.5 GB: the OS will report roughly 698.5 GB. 
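The decimal-versus-binary gigabyte gap is easy to compute yourself. Here is a minimal sketch in Python (the constant and function names are mine, for illustration only; nothing here comes from any vendor tool):

```python
# How drive makers count vs. how the OS counts a "gigabyte".
DECIMAL_GB = 1000**3  # marketing: 1 GB = 1,000,000,000 bytes
BINARY_GB = 1024**3   # operating systems: 1 GB = 1,073,741,824 bytes

def reported_size_gb(marketing_gb: float) -> float:
    """Return the size (in binary GB) the OS will report for a drive
    advertised as `marketing_gb` decimal gigabytes."""
    return marketing_gb * DECIMAL_GB / BINARY_GB

print(round(reported_size_gb(750), 1))   # a "750 GB" drive -> 698.5
print(round(reported_size_gb(1000), 1))  # a "1 TB" drive   -> 931.3
```

The ratio is constant (1000³/1024³ ≈ 0.9313), so the advertised number always overstates the reported size by about 7%, whatever the drive's capacity.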
If the trend continues, a 1 TB disk will actually show up as about 931 GB (roughly 69 GB, or 7%, less).robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com0tag:blogger.com,1999:blog-7329601.post-1145093110137384902006-04-15T09:54:00.000+02:002006-04-15T11:25:10.190+02:00DRM mythsI don't want to become a DRM advocate, but when I read articles like <a href="http://www.theregister.co.uk/2006/04/15/lessig_stallman_drm/">this one</a> I feel a more balanced view should be expressed.<br /><br />Both Lessig and Stallman are smart people. But it turns out they are very idealistic, and unrealistic, in their opinions on DRM and on freedom in general.<br /><br /><blockquote>The values of the Free Software Movement are the freedom to cooperate, and the freedom to have control over your own life. You should be free to control the software in your computer, and you should be free to share it.</blockquote><br /><br />The GPL that Stallman wrote (or at least was the main driving force behind) doesn't just give you the freedom to share: it makes sharing an obligation. A freedom is replaced with a constraint. It really feels like someone is deciding for you what kind of freedom is good for you. So it's always surprising to see him attack other open source licenses (some of which give you all rights) in the name of freedom.<br /><br />Now about DRM: the freedom given to people (restricted rights) is exactly what they paid for. If they want more rights they should pay more money, and if they accept fewer rights, they should pay less. That's the very basic reason renting a DVD is cheaper than buying it. And the market, in other words consumers, will decide what kind of rights they want. It's just unfortunate that the whole content market is ruled by oligopolies (in movies and in music), so the DRM offers and choices are very scarce. 
But this has nothing to do with whether DRM is good or bad.<br /><br /><blockquote>the whole point of DRM is to deny your freedom and prevent you from having control over the software you use to access certain data.</blockquote><br /><br />When smart people start using stereotypes, it means something fishy is going on. After all, the whole GPL relies on copyright law. And Stallman, as such, is a strong defender of copyright ("copyleft", as they call it) and intellectual property. But electronic means of making sure those rights are always respected are apparently not acceptable. He probably considers enforcing them in court preferable, even for people who earn no money from what they create (as most GPL developers do).robux4http://www.blogger.com/profile/17314970638635879042noreply@blogger.com4