Tags: fosdem, conference
Like others, I’m sitting in the train from Brussels to Paris after a very intensive weekend. Indeed, FOSDEM 2023 did happen: after being suspended for two years during the pandemic, at least in its in-person version, it is back. I couldn’t ignore this and had to go!
Now, on to my quick and very much biased recap of the weekend.
I managed to attend a few talks and take written notes. I miss my old tablet; I would have preferred to get back into sketchnoting. Maybe next time…
A nice talk from Daniel Stone, who really knows the topic in depth. I admit I expected something a bit less high level though. Still, if you are curious about how the graphics stack is structured on Linux, it was definitely worth attending.
It does a very good job of explaining the respective roles of DRM/KMS, EGL and GBM, and how they fit together. It also shows how Wayland is purposefully aligned with KMS. I picked up a few extra pieces of information I didn’t expect on the topic.
For a few years now, I have somehow ended up attending a talk I didn’t plan to, for various reasons. This year it was definitely this one: I hadn’t marked it down at all on my hit list, but I happened to be in the Open Media room when it was presented.
It was definitely outside my comfort zone, and still I found it very interesting and accessible enough. It explained HDR content display quite well, and how AV1 handles it.
I admit I was a bit fascinated by the part about testing the rendering and whether the standards are properly respected. It’s clearly very complex work, with lots of very subtle details to get a test lab right. Indeed, you even need to take care of the immediate environment of the lab, since it can impact the perception of colors!
Unexpected as I said but I don’t regret it.
Nowadays I’m definitely getting more curious about the energy use of the software we run daily… and what is more commonly used than your web browser, really?
Well, this is in part why the Mozilla people added energy profiling to the Firefox Profiler. It’s not only important from a sustainability perspective; it also has a direct impact on the user experience in terms of noise, heat and battery life.
This type of profiling requires looking at CPU use, GPU use, thread wake-ups and network use. The browser tended to waste a lot of energy due to wake-ups, invisible animations and timers. They introduced tooling around the task manager to ease hunting down those problems.
This, coupled with telemetry, allowed them to build a global picture of how the browser operates in the wild. Satisfying the browser’s total daily consumption worldwide would require a small power station. Definitely not a small thing!
Anyway, all of this allowed them to make several targeted fixes which improved the situation greatly. More improvements are needed of course, as well as experiments to see how they’ll behave in the wild.
Interesting stuff and definitely a tool I’ll use in the future, because… well see the next talk.
I stayed in the same room to learn more about the new features of the Firefox Profiler. Unfortunately the talk started late and had to be a bit rushed.
They added nice improvements to it recently, like:
- power profiling (with Watts and CO2 estimations)
- source code view and inline call stacks
- one-click profiling for tabs
- localization
- lots of documentation improvements
But the thing I’m most excited about is the importers they added. This turns the Firefox Profiler into a generic interface to explore profiling data, and a very nice and thorough one at that… Right now they can import traces from Chrome, perf, ART Trace and callgrind. There’s even a Java JFR profiler compatible with it.
I find it really cool to see an ecosystem growing around it.
This one was really not what I expected. I was looking for something on how FOSS projects can do better in terms of environmental sustainability; it was more a survey of FOSS projects aimed at environmental sustainability topics. Fair enough, I was wrong, and it was still worth listening to.
Unfortunately I didn’t find it very well delivered; the pace was a bit too slow, or maybe the energy was a bit lacking. Also, there were some biases in the information collected, or in what they were looking at, which rubbed me the wrong way.
Still, I got a few interesting nuggets out of it. There seems to be an overly large presence of communities, academia and governmental agencies in such projects. Companies are almost nowhere to be seen; somehow the incentives must be lacking.
It also showed how well greenwashing unfortunately works. The current investment ratings in terms of sustainability are definitely untrustworthy. For instance, the disconnect between carbon credits and the amount of carbon storage actually happening is staggering.
I thus agree with the diagnosis: the models used for those ratings should be opened. This is necessary for provability and traceability, and it indeed makes Open Source the most underestimated strategy for climate sustainability. Without it we can be easily misled.
For this one the title was very misleading. It had nothing to do with a deep dive and was very much high level.
The advice was sound of course, but I expected to learn more and to get more tips on how to squeeze performance out of queries, not only on how to diagnose where the bottlenecks are.
Still, I got away with a few tools to look at and assess whether they can really help us:
- SQLcommenter to pass metadata to the database;
- pg_stat_monitor to get deeper query performance insights.
So clearly not the best talk I’ve seen, but the tools mentioned might be worth checking out.
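To make the first idea concrete, here is a hand-rolled sketch of what SQLcommenter-style annotation looks like: request metadata is appended to each query as a comment, so it surfaces in the database’s logs and statistics. The helper below is hypothetical and is not the real sqlcommenter library.

```python
from urllib.parse import quote

def add_sqlcomment(query: str, **meta: str) -> str:
    """Append SQLcommenter-style metadata to a query.

    Hypothetical helper: keys are sorted, values URL-encoded,
    in the key='value' comment format sqlcommenter popularized.
    """
    pairs = ",".join(f"{k}='{quote(v)}'" for k, v in sorted(meta.items()))
    return f"{query.rstrip(';')} /*{pairs}*/"

q = add_sqlcomment(
    "SELECT * FROM users WHERE id = %s;",
    controller="users",
    route="/user/:id",
)
print(q)
# SELECT * FROM users WHERE id = %s /*controller='users',route='/user/%3Aid'*/
```

The payoff is that a slow query found in `pg_stat_statements` or pg_stat_monitor can be traced straight back to the application route that issued it.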
I was very curious to get news about Matrix’s progress. Also, Matthew Hodgson is a good speaker and entertainer, so I hoped it’d be time well spent.
It definitely was! The amount of progress from the Matrix community is just very impressive. In his demo-packed talk we were shown:
- how much the performance improved overall
- how the multiparty VoIP features are doing great
- the possibility of having, down the line, a Matrix in P2P mode
- and their new Third Room client, which provides virtual-space features; I’m especially excited about its use of glTF as the base scene format and the fact that it works with the WebXR APIs
It was also great to see that more governments and agencies are making use of it now. It’s clearly spreading and getting traction. Hopefully the Digital Markets Act will push that momentum further, along with the development of bridges.
Very interesting times for this protocol, let’s hope it keeps going strong.
For a few years now, some enioka Haute Couture developers have been attending FOSDEM. This year more of us attended, and we also managed to bring people from our sister company enioka Consulting with us. Our group thus exploded in size, with a head count well above 15!
I thought it was a good opportunity to run a little experiment, so I asked each of them to tell me about their favorite talk of the weekend. Here is what they came up with.
DuckDB: Bringing analytical SQL directly to your Python shell
This is a columnar database embedded in a Python process (although other bindings are available). It’s kind of a column-oriented SQLite, providing zero-copy integration with Pandas and NumPy, all while exposing a 100% SQL-based interface.
Building a Semantic Search Application in Python, Using Haystack
How to easily create semantic search applications using this NLP framework and models from HuggingFace. This was an interesting talk about a tool which will be very useful to us.
Automated short term train planning on OSRD
How to insert a train at the last minute into a traffic plan defined weeks ago. The SNCF network works on this problem WITHOUT using AI!
Similarity Detection in Online Integrity
A collection of tools developed by Meta to identify pictures and videos considered “illegal”. It’s interesting to see the diversity of solutions they use, ranging from hash comparison to neural networks. The platforms collaborate there through NCMEC, which provides a way to share detected illegal content. If you’re concerned your nudes might leak, you can get them added to the NCMEC list to prevent their publication. Conspicuously missing from this talk were the content moderators; Meta claims to rely on its “community” (sic).
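For a feel of the hash-comparison end of that spectrum, here is a toy average-hash sketch in pure Python. Production systems use perceptual hashes like PDQ or PhotoDNA over real images, so treat this tiny 2×2 “image” purely as an illustration of the principle: similar pictures produce hashes that differ in few bits.

```python
def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string:
    1 where the pixel is above the mean brightness, 0 otherwise."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(h1, h2):
    """Number of differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

original = [[10, 200], [220, 30]]   # tiny 2x2 "image"
tweaked  = [[12, 198], [215, 35]]   # re-encoded / slightly altered copy
distance = hamming(average_hash(original), average_hash(tweaked))
print(distance)  # a small distance flags a likely match despite the changes
```

Unlike a cryptographic hash, this stays stable under small edits (recompression, brightness tweaks), which is exactly what makes it usable for matching against a shared list of known content.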
pip install malware
A dynamic talk showing the best practices and things to pay attention to when publishing a library via a package manager (like pip). It covers how malevolent packages can get installed through typosquatting, “hidden” dependencies or popularity theft (starjacking), and how to prevent it.
Kerberos PKINIT: what, why, and how (to break it)

An accessible explanation of Kerberos and of the PKINIT extension, which allows using X.509 certificates to authenticate users. It also covers how its FreeIPA implementation could lead to privilege escalation.
A deep dive inside the Rust frontend for GCC
Nice presentation and progress report on the GCC frontend to compile Rust code.
OpenSTEF: Open Source energy predictions
A use case of an Open Source tool to manage energy, in particular to predict congestion on the electrical grid.
Understanding the energy use of Firefox
How telemetry helps figure out Firefox’s energy consumption, and how its profiling tools make it possible to optimize it.
Melrōse, a music programming environment
A framework to generate MIDI sequences. It comes with an advanced language to write melodies, harmonies and rhythms altogether. For me the main interest of the framework lies in the methods which allow composing easily. Also, the code is reinterpreted at each iteration, so the generated sequence can be modified from one loop to the next.
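Independently of Melrōse itself, generating a MIDI sequence programmatically can be sketched in plain Python with only the stdlib. The note numbers, velocities and durations below are arbitrary illustration values; Melrōse operates at a much higher level than this byte pushing.

```python
import struct

def vlq(n):
    """Encode a delta time as a MIDI variable-length quantity."""
    out = [n & 0x7F]
    n >>= 7
    while n:
        out.append(0x80 | (n & 0x7F))
        n >>= 7
    return bytes(reversed(out))

def melody_to_midi(notes, ticks=480):
    """notes: MIDI note numbers, played as consecutive quarter notes
    on channel 1. Returns the bytes of a single-track MIDI file."""
    track = b""
    for note in notes:
        track += vlq(0) + bytes([0x90, note, 0x60])      # note on, velocity 96
        track += vlq(ticks) + bytes([0x80, note, 0x40])  # note off one beat later
    track += vlq(0) + b"\xFF\x2F\x00"                    # end-of-track meta event
    header = b"MThd" + struct.pack(">IHHH", 6, 0, 1, ticks)
    return header + b"MTrk" + struct.pack(">I", len(track)) + track

data = melody_to_midi([60, 62, 64, 65])  # C D E F
# open("melody.mid", "wb").write(data) would yield a playable file
print(len(data), "bytes, starting with", data[:4])
```

Regenerating the byte string on every loop iteration is, in miniature, the same trick Melrōse exploits: since the sequence is rebuilt from code each time, editing the code live changes the next loop.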
Kotlin Multiplatform for Android & iOS library developers
Feedback from two developers using Kotlin for native Android and iOS applications. They showed how to best leverage Kotlin by fine-tuning it and working around OS-specific bugs with annotations.
Application Monitoring with Grafana and OpenTelemetry
Or how to monitor your applications with Open Source tools, storing and exploring traces, logs and metrics as a whole.
As you may notice, I didn’t attend many talks… That’s because FOSDEM’s main value comes from the “hallway track”: it’s best to meet and chat with people as much as possible. And I can tell you that did happen!
It was a real pleasure to reconnect with people I appreciate and love there. Hugs, conversations and drinks were in order. Thank you all for that, hope to see you all again soonish!
It was also nice to meet new friendly faces. For instance, I used the opportunity to congratulate the Penpot people on announcing their official launch (what they’re doing is really exciting and important, and I wish them success).
But now it’s over… Time to cross the border again and crash on my bed. Exhausted but satisfied.