My thoughts on Slackware, life and everything

Chromium source tarball availability

Someone asked how I am creating Chromium (also -ungoogled) packages these days. When you download my SlackBuild script and attempt to build the package yourself, the script errors out because it cannot download the sources.

For weeks now, Google’s automation for creating Chromium source tarballs has been broken. Apparently a bug prevents their CI/CD pipelines from succeeding. This is reportedly fixed for the Chromium 132.x releases (currently their Beta versions), but the Chromium 130 and now also the 131 releases were announced without their accompanying source tarballs.
I have therefore created a script which fetches the required sources from git and packs them into a “chromium-$VERSION.tar.xz” tarball. You will find it in the “build” directory and it is called “fetch-chromium-release.sh”. You run that script with an official Chromium version number, wait a long, long time, and you end up with a 6+ GB source tarball which you can then move into your local “build” directory.
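To give you an idea, this is roughly how a run of that script looks. Treat it as a sketch: the version number below is just an example and the exact arguments the script expects may differ slightly.

```bash
# Example run; the version number is only an example.
cd /path/to/chromium/build            # the "build" directory that holds fetch-chromium-release.sh
VERSION=131.0.6778.85                 # an official Chromium release number
./fetch-chromium-release.sh $VERSION
# ... several hours later you should have chromium-$VERSION.tar.xz (6+ GB);
# move it into the local "build" directory used by the SlackBuild script.
```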

Hope that clarifies things and helps you compile chromium or chromium-ungoogled yourself.
Have fun! Eric

July update for Chromium 126

The latest release of the Chromium source code (version 126.0.6478.182, made available on July 16th) addresses several vulnerabilities as usual. Some of them are rated ‘High’ but none are ‘Critical’, and no new 0-days are reported.

You can fetch my Slackware 15.0 and -current packages both for chromium and (hopefully soon, because its source has not been released yet) chromium-ungoogled. You may prefer one of the mirror servers (like my own US server and, in a short while, the UK mirror) in case my primary slackware.nl server is not responding or is too slow.
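If you have never upgraded one of my packages by hand: a minimal sketch, assuming you have already downloaded the .txz for your Slackware version from the repository or one of the mirrors (the filenames below are examples):

```bash
# Upgrade (or freshly install) the downloaded package; filenames are examples.
upgradepkg --install-new chromium-126.0.6478.182-x86_64-1alien.txz
# Once its source is released and the package appears, the same goes for:
upgradepkg --install-new chromium-ungoogled-126.0.6478.182-x86_64-1alien.txz
```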

I am slowly transitioning back from being a caregiver for my mother (which absorbed me completely for 10 months) to a life where I have time and energy in the evenings to hack on Slackware again. You will soon find the next installment in the Slackware Cloud Server series here on the blog – an article dedicated to building a Docker stack as cloud storage backend for the open source Ente Auth 2-factor authentication app.

Cheers, Eric

Authenticators for 2FA

Multi-factor authentication: it is difficult to find high-profile websites these days that allow you to get away with a simple password-based login. It’s a sobering thought to realize how fast your ‘secure’ password can be hacked using sophisticated techniques that go way beyond brute-force cracking.

So, multi-factor authentication has become all the rage. When you authenticate yourself, you increase the security of your account by providing multiple ‘factors’: something you know (a password or PIN code); something you have (a cryptographic identification device or a token); and something you are (a biometric quality such as a fingerprint or a face scan).

When two of these factors are required, we speak of ‘two-factor authentication’, better known as ‘2FA’. Usually this means a password combined with a string of digits (a token) produced by an authenticator – whether that is a hardware device or a software implementation.

Authenticator apps on smartphones are popular. Google, Apple and Microsoft each have their own authenticator, which you can find in the respective app stores for your smartphone. They are really easy to use and completely interchangeable – every authenticator will generate the exact same code for a website at the same moment in time (a small illustration follows below).
The disadvantage of these authenticators becomes clear when you lose your smartphone… gone are the authentication codes you need to log on to your account! You will have to contact customer support to disable your 2FA so that you can access your data again, and then re-enable 2FA using an authenticator on your new phone.
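The reason they are interchangeable is that TOTP codes (RFC 6238) are derived purely from the shared secret you scanned from the website’s QR code plus the current time. You can see that for yourself with a tool like oathtool from the oath-toolkit (just an illustration, not something any of the apps need; the secret below is made up):

```bash
# Any device that knows the same base32 secret produces the same 6 digits
# within the same 30-second window.
SECRET="JBSWY3DPEHPK3PXP"          # made-up example secret
oathtool --totp --base32 "$SECRET"
```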

That’s why Authy became so popular: this is an authenticator which stores your 2FA tokens securely in the company’s (Twilio) cloud storage. With Authy, you can authorize another device (smartphone or desktop) to generate the same 2FA codes for you. And as long as you remember the passphrase which encrypts your cloud-stored tokens, you do not need your original phone to authorize a new phone. Really convenient!
Unfortunately, Authy does not offer a way to export your tokens from their app. It is total vendor lock-in, as happens so often. And this month, Authy’s Windows and Linux desktop applications stop working, leaving only Android and iOS as supported platforms for your authenticator. On top of that, there was a recent breach of Authy’s cloud storage, leaking 30+ million phone numbers associated with Authy accounts. That facilitates phishing attacks of course, but also: when you tried to recover your account after the loss of your phone, Authy would first ask for your phone number and then continue granting access to the related account. Security updates to the Authy apps on all platforms now prevent application initialization based on your phone number alone, but the message is clear: if you cannot fully trust the company providing you with one of your two authentication factors, it may be time to switch.
But, the lack of export capability… indeed.

I have been using Authy for a couple of years, precisely because of the convenience it offers in the rare case that you lose (access to) your phone. Now, being really pissed about the vendor lock-in, I went looking for an acceptable alternative authenticator. And I found Ente Auth. It is an open source 2FA authenticator, with the option (not mandatory) to create an account at Ente and sync your local 2FA tokens to their cloud server. The end-to-end encryption used by Ente has been independently audited, and the app allows you to both import (from other authenticators that are not Authy) and export your tokens. Ente’s server offers a read-only version of the authenticator interface, which means that after login you can find your 2FA codes in your browser as well.
Switching from Authy to Ente Auth was a slow and painful process, where I had to disable and re-enable 2FA on many websites, but now I am ready to use Ente Auth exclusively. I can highly recommend this app.

What’s more: Ente has also open-sourced its backend server. Ente is first and foremost an open source and secure alternative to Google Photos or iCloud: a place to store your photos and videos. But the authentication backend was built as standalone functionality from the start, which allowed the company to build Ente Auth around that backend. By open-sourcing the backend, you can have complete control over the cloud storage of your 2FA tokens! An account on ente.io is then not needed; you simply instruct the authenticator app to connect to your own server address.
And as a bonus, you also get a secure and self-hosted alternative to Google Photos.
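A detailed walk-through will follow in the announced blog article, but as a rough sketch (based on Ente’s public monorepo; the exact commands, port and app settings may differ from what I will describe there):

```bash
# Rough sketch only; check Ente's own self-hosting documentation for the real steps.
git clone https://github.com/ente-io/ente.git
cd ente/server
docker compose up --build -d    # starts the "museum" API server plus its dependencies
# Then tell the Ente Auth app to use a custom server and point it at your
# own endpoint, for example http://your-host:8080
```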

If there’s an interest in a follow-up article explaining how to self-host the Ente Auth server backend, let me know in the comments section below.

Have fun! Eric

Chromium update fixes 5th zero-day exploit for 2024

Google’s release notes for the latest Chromium 124.0.6367.201 source code mention that this release fixes a zero-day vulnerability. Beware: this is already the 5th zero-day that was reported and fixed in Chromium in 2024.

This vulnerability, labeled CVE-2024-4671, is already being actively exploited in the wild, so please upgrade your chromium and chromium-ungoogled packages as soon as you can.

You can fetch my Slackware 15.0 and -current packages both for chromium and chromium-ungoogled. You can also visit the mirror servers (like my own US server and, in a short while, the UK mirror) in case my own server is not responding or is too slow.

Note that I still do not provide 32bit package updates for Chromium and Chromium-ungoogled. It is a lot of work to find out how to compile rust and llvm on 32bit Slackware ‘the Google way’ and so far the solution has eluded me.
I need these custom rust and clang compilers to compile Chromium sources on 32bit.
And please don’t tell me to ‘look at how Debian does it’ – it does not help.

Cheers, Eric

Chromium 121 for Slackware… don’t hold your breath

Chromium 121 sources were released yesterday, and as much as I would like to tell you that the Slackware packages are ready, it appears that you will have to wait for them for an unspecified amount of time.

I found out that the build of Chromium now needs Google’s custom version of the Rust compiler, in addition to Google’s custom version of the Clang compiler. Those Rust and Clang versions are intertwined, and Google advises packagers to simply use their own pre-compiled binaries which they provide for download.

You guessed it… those binaries are not available for a 32bit OS. Nothing new, and it is for that exact reason that, as part of compiling Chromium for Slackware, the complete LLVM toolchain is built from Google’s sources first – for every package I release. Tweaking the LLVM/Clang compilation so that it works for 32bit Slackware took a lot of time – after all, no one at Google tests their sources for 32bit build compatibility. So I patch here and there and every time feel lucky that it still works.
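For the curious: the Chromium build is configured through GN args, and once a self-built toolchain exists you point the build at it with something along these lines. This is a sketch, not a copy of my SlackBuild; the paths are placeholders and the variable names are the ones Chromium’s GN files use as far as I know.

```bash
# Sketch: point the Chromium build at a self-built toolchain instead of
# Google's pre-compiled binaries. Paths and versions are placeholders.
cat > out/Release/args.gn << 'EOF'
clang_base_path = "/opt/self-built-llvm"        # LLVM/Clang built from Google's sources
clang_use_chrome_plugins = false                # the plugins only ship with Google's binaries
rust_sysroot_absolute = "/opt/self-built-rust"  # the new Rust requirement in 121
rustc_version = "rustc 1.74.0"                  # example; must match the output of 'rustc --version'
EOF
gn gen out/Release
```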

Until today, when I ran into the new Rust requirement. After the umpteenth iteration of a Chromium package build using a variety of changing options, I still fail to even start compiling a Rust binary.

I am taking a break from this to consider my options. My aim is to keep supporting the 32bit Slackware package. I just need to figure out how Google messed this up again and find a way around it. In the meantime, don’t hold your breath – I only have a few hours each evening to do the troubleshooting. A new package will appear when it’s ready.

All the best, Eric

Update 2024-jan-29: I have built 64bit packages for Chromium (also -ungoogled) version 121.0.6167.85 and uploaded them to my repository.
Note that I currently cannot compile their 32bit versions, because so far I have not been successful in building Google’s custom llvm and rust from source. I had to resort to downloading and using Google’s pre-compiled binaries, which they only supply for 64bit systems.
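For reference, fetching those pre-compiled 64bit toolchains is done with helper scripts that live in the Chromium source tree itself; roughly like this (a sketch, with the script locations as I know them from the tree):

```bash
# Download Google's pre-built toolchains (64bit only), using Chromium's own helpers.
cd chromium-121.0.6167.85               # the unpacked source tree
python3 tools/clang/scripts/update.py   # fetches the matching pre-built Clang
python3 tools/rust/update_rust.py       # fetches the matching pre-built Rust
```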

I am still determined to find a way to compile these llvm and rust compilers from Google’s own sources. But I have no ETA on that unfortunately.
