Alien Pastures

My thoughts on Slackware, life and everything

Kernel 5.13 went into Slackware-current; new Live ISOs to celebrate

This week, the Linux 5.13 kernel finally landed in the core of Slackware-current after having lived in ‘./testing‘ for a while. Actually it’s the 5.13.2 kernel, which has some 800 patches applied compared to 5.13.1.

And with that new 5.13.x kernel series, it looks like a Slackware 15 release is again closer on the horizon.

To celebrate the occasion, I have generated a new batch of ISOs for the Slackware Live Edition. At download.liveslak.org you will find ISO images for the variants based on the latest release of the liveslak scripts (in this case, version 1.3.9.5 of liveslak).
These Live ISOs boot Slackware -current “Thu Jul 15 22:37:59 UTC 2021” with the new 5.13.2 mainline kernel: SLACKWARE (32bit & 64bit), XFCE (32bit & 64bit), CINNAMON, MATE, DAW and LEAN.
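
By the way, if you want to put one of these ISOs onto a USB stick with persistence, the liveslak sources contain the ‘iso2usb.sh‘ script for exactly that purpose. A typical invocation looks like the sketch below (check the script’s help output for the full option list, and replace ‘/dev/sdX’ with your USB stick’s device name – the stick will be wiped):

# ./iso2usb.sh -i slackware64-live-current.iso -o /dev/sdX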

The ‘bonus‘ section contains a batch of squashfs modules you can use with your liveslak (copy them into the ‘addons’ directory of your persistent USB drive) if you want to expand the functionality of the Live OS.
Among these you’ll find the binary nvidia driver (already contained in the CINNAMON, DAW and MATE ISOs by the way); the Broadcom STA wireless driver, Wine 6.12, multilib, the DAW package collection, and a set from my own repository (chromium, libreoffice, veracrypt, vlc etc).
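
As an example of using such an addon: roughly speaking, you mount the stick’s Linux data partition and copy the squashfs module into its addons directory. The partition number, mount point, directory path and module filename below are placeholders – check how your own stick is laid out:

# mount /dev/sdX3 /mnt/usb
# cp wine-*.sxz /mnt/usb/liveslak/addons/
# umount /mnt/usb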

Have fun! Eric

How I set up Apache for slackware.nl

Someone asked how I achieved the refreshed look & feel on slackware.nl and download.liveslak.org with the fancier directory listings and a graph of the current network bandwidth usage at the bottom.

It’s not so difficult, but if you are new to setting up a web site oriented mainly at content delivery and have not worked with dynamic page generation using server-side includes (SSI), it may be useful to have some kind of reference.

So I decided to write up a short explanation on how I configured Apache httpd for the above two web sites. We’ll just imagine that you want to serve a slackware-current mirror from http://darkstar.home.arpa/slackware/slackware-current/

Bandwidth usage graph

Let’s first look at what I did to generate that graph of the network bandwidth usage. Actually I stole the idea from slackware.uk where I noticed it for the first time. I had no idea what kind of tool Tadgy was using there, so I applied my Google-fu and found bwbar, a nifty little program written by H. Peter Anvin. I applied a few patches to the source code, built a Slackware package for it and installed that on my online server.
The next step is to decide where bwbar should write its bandwidth usage data so that my Apache webserver can display that data. For the sake of this instruction, let’s assume a completely default, unmodified httpd configuration. This means web content goes into “/srv/httpd/htdocs/”. My customizations will also go into “/srv/httpd/” but I want to keep those files outside the htdocs directory. That’s just me, I want to keep content and presentation separated.
Further down, you will see that I use “Alias” statements in the httpd configuration to bind these directories into the htdocs content area (the so-called DocumentRoot).
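
For reference, this is the directory layout below “/srv/httpd/” that the rest of this write-up assumes:

/srv/httpd/
    htdocs/             <- the DocumentRoot; the slackware/ mirror tree goes in here
    pres/               <- HEADER.html, FOOTER.html and the bwbar output (ubar.png, ubar.txt)
    css/                <- index.css
    icons/              <- the icon files (compressed.gif and friends)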

I chose “/srv/httpd/pres/” as the location where the web server will find its presentation data. I want the ‘apache’ user to own that directory so that it can write the bwbar output there:

# mkdir /srv/httpd/pres
# chown apache /srv/httpd/pres

Then a commandline added to “/etc/rc.d/rc.local” will start bwbar during boot-up (as the apache user) and make it generate and refresh a graph (ubar.png) and accompanying text (ubar.txt) every 60 seconds:

# Start bandwidth reporting:
/bin/su - apache -s /bin/bash -c "/usr/bin/bwbar \
--output \
--directory /srv/httpd/pres \
--text-file ubar.txt \
--png-file ubar.png \
--width 600 --height 4 --border 1 \
--interval 60 \
--Mbps \
eth0 1000 & "

Once this commandline executes, you’ll notice it starts writing in “/srv/httpd/pres/”. It will keep running in the background and refresh the bandwidth usage data every 60 seconds.
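
A quick way to check that bwbar is doing its job: after a minute or so the two output files should exist and be refreshed regularly:

# ls -l /srv/httpd/pres/ubar.png /srv/httpd/pres/ubar.txt
# cat /srv/httpd/pres/ubar.txt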

Apache httpd.conf

Apache httpd configuration files are found below “/etc/httpd/”. Again assuming a completely out-of-the-box installation, the following lines get added to the bottom of “/etc/httpd/httpd.conf”:

# Use your own hostname here of course:
ServerName darkstar.home.arpa:80

# Load the module that enables server-side includes (SSI):
LoadModule include_module lib64/httpd/modules/mod_include.so

# We only actually need SSI inside the 'presentation' directory:
<Directory "/srv/httpd/pres/">
    # Make your .html page executable if you want httpd to parse it for SSI:
    XBitHack on
    Options +Includes
</Directory>

# This is where we will host our slackware-current mirror.
# We want every web page there to have a fancy header and footer,
# and the actual directory content in between:
<Directory "/srv/httpd/htdocs/slackware/">
    HeaderName /pres/HEADER.html
    ReadmeName /pres/FOOTER.html
</Directory>

# Our CSS files are outside the DocumentRoot so we need to bind them in:
Alias /css/ "/srv/httpd/css/"

<Location /css>
    Options -Indexes
    Require all granted
</Location>

# Our header, footer and bandwidth usage files are outside the DocumentRoot,
# so we need to bind them in:
Alias /pres/ "/srv/httpd/pres/"

<Location /pres>
    Options -Indexes +Includes
    Require all granted
</Location>

# Our icon files are outside the DocumentRoot so we need to bind them in:
Alias /icons/ "/srv/httpd/icons/"

<Directory "/srv/httpd/icons">
    Options Indexes MultiViews
    AllowOverride None
    Require all granted
</Directory>

# And then we add a whole bunch of icons to represent the content we may serve:
AddIconByEncoding (CMP,/icons/compressed.gif) x-compress x-gzip

AddIconByType (TXT,/icons/text.gif) text/*
AddIconByType (IMG,/icons/image2.gif) image/*
AddIconByType (SND,/icons/sound2.gif) audio/*
AddIconByType (VID,/icons/movie.gif) video/*

AddIcon /icons/binary.gif .bin .exe
AddIcon /icons/binhex.gif .hqx
AddIcon /icons/tar.gif .tar
AddIcon /icons/world2.gif .wrl .wrl.gz .vrml .vrm .iv
AddIcon /icons/compressed.gif .Z .z .tgz .gz .zip
AddIcon /icons/a.gif .ps .ai .eps
AddIcon /icons/layout.gif .html .shtml .htm .pdf
AddIcon /icons/text.gif .txt
AddIcon /icons/c.gif .c
AddIcon /icons/p.gif .pl .py
AddIcon /icons/f.gif .for
AddIcon /icons/dvi.gif .dvi
AddIcon /icons/uuencoded.gif .uu
AddIcon /icons/script.gif .conf .sh .shar .csh .ksh .tcl
AddIcon /icons/tex.gif .tex
AddIcon /icons/bomb.gif core

AddIcon /icons/back.gif ..
AddIcon /icons/hand.right.gif README
AddIcon /icons/folder.gif ^^DIRECTORY^^
AddIcon /icons/blank.gif ^^BLANKICON^^

DefaultIcon /icons/unknown.gif

# Here is where it starts getting interesting.
# This 'IndexOptions' statement will already improve the look and feel of a directory index a lot:
IndexOptions FancyIndexing HTMLTable VersionSort NameWidth=*

# Some stuff we don't want to be visible to our visitors:
IndexIgnore .??* *~ *# HEADER* README.html RCS CVS *,v *,t favicon.ico

# And we want to spice up the index pages with some custom CSS (Cascading Style Sheets):
IndexStyleSheet /css/index.css

# Add a bit of metadata to our index pages to help people find them using online search:
IndexHeadInsert " \
<meta name=\"description\" content=\"Alien BOB's Slackware Linux mirror site.\"> \
<meta name=\"keywords\" content=\"Slackware,Slackware,Slackware Linux,alienBOB,alien\">"

Cascading Style Sheet (CSS)

This is some nice CSS which I found at https://www.apachelounge.com/viewtopic.php?p=21764 . Create a separate css directory to match the httpd.conf snippet above:

# mkdir /srv/httpd/css
# vi /srv/httpd/css/index.css

And add the following content into that index.css file:

body {
    background:#ffffff;
}

table {
    margin:10px auto;
    width:1000px;
    background:#edeeff;
    padding:20px;
    -moz-border-radius: 15px;
    border-radius: 15px;
}
table tr td {
    padding:2px;
}

h1#indextitle {
    margin:20px auto;
    width:990px;
    color:#001133;
    background:#edeeff;
    padding:20px;
    -moz-border-radius: 15px;
    border-radius: 15px;
}

h1#indextitle {
    font-size:20px;
    font-family: "Segoe UI", "Bitstream Vera Sans", "DejaVu Sans", "Bitstream Vera Sans", "Trebuchet MS", Verdana, sans serif;
}

tr.indexhead {
    font-size:16px;
    font-family: "Segoe UI", "Bitstream Vera Sans", "DejaVu Sans", "Bitstream Vera Sans", "Trebuchet MS", Verdana, sans serif;
}
tr.even {
    background:#fffff7; /* */
}
tr.odd {
    background:#edeeff; /* */
}

tr.indexbreakrow th hr {
    border:0;
    height:1px;
    background:black;
}

th.indexcolname {
    text-align:left;
}
th.indexcolname a:link,
th.indexcolname a:visited {
    color:#323B40;
    text-decoration:none;
}
th.indexcolname a:hover {
    color:#a81818;
    text-decoration:none;
}

th.indexcollastmod {
    text-align:center;
}
th.indexcollastmod a:link,
th.indexcollastmod a:visited {
    color:#323B40;
    text-decoration:none;
}
th.indexcollastmod a:hover {
    color:#a81818;
    text-decoration:none;
}

th.indexcolsize {
    text-align:right;
    padding-right:20px;
}
th.indexcolsize a:link,
th.indexcolsize a:visited {
    color:#323B40;
    text-decoration:none;
}
th.indexcolsize a:hover {
    color:#a81818;
    text-decoration:none;
}

th.indexcoldesc {
    text-align:left;
}
th.indexcoldesc a:link,
th.indexcoldesc a:visited {
    color:#323B40;
    text-decoration:none;
}
th.indexcoldesc a:hover {
    color:#a81818;
    text-decoration:none;
}

td.indexcolicon {
    text-align:center;
}

td.indexcolname {
    font-family:verdana;
    font-size:14px;
}
td.indexcolname a:link,
td.indexcolname a:visited {
    color:#323B40;
    text-decoration:none;
}
td.indexcolname a:hover {
    color:#a81818;
    text-decoration:none;
}

td.indexcollastmod {
    font-family: "Segoe UI", "Bitstream Vera Sans", "DejaVu Sans", "Bitstream Vera Sans", "Trebuchet MS", Verdana, sans serif;
    font-size:12px;
    text-align:center;
}
td.indexcolsize {
    font-family: "Segoe UI", "Bitstream Vera Sans", "DejaVu Sans", "Bitstream Vera Sans", "Trebuchet MS", Verdana, sans serif;
    font-size:12px;
    text-align:right;
    padding-right:20px;
}
td.indexcoldesc {
    font-family: "Segoe UI", "Bitstream Vera Sans", "DejaVu Sans", "Bitstream Vera Sans", "Trebuchet MS", Verdana, sans serif;
    font-size:12px;
    text-align:left;
    padding-right:20px;
}

Page header and footer

The HTML in the page header shows a descriptive text and ensures that the page content is nicely centered in the browser window.

Add the following to “/srv/httpd/pres/HEADER.html”:

<hr width="100%">
  <center>
    <table>
      <tr>
        <td align="left">
        Welcome to SLACKWARE.NL!<br>
        This mirror service is provided by Alien BOB
        </td>
      </tr>
    </table>

The HTML in the footer uses server-side includes, which is what gives the pages their dynamic nature: when the HTML is parsed, the actual bandwidth usage data is included on the spot.

Add this to “/srv/httpd/pres/FOOTER.html”:

<center>
<br>
<img src="/pres/ubar.png" title="<!--#include virtual='/pres/ubar.txt' -->" alt="<!--#include virtual='/pres/ubar.txt' -->" width=80% height=10 border=0>
<br>
<!--#include virtual="/pres/ubar.txt" -->
</center>

And then make this file executable (!), which tells Apache httpd to parse the HTML for SSI (that’s what that XBitHack statement was for):

# chmod +x /srv/httpd/pres/FOOTER.html

Add content

Create the directory “/srv/httpd/htdocs/slackware” and download a “slackware-current” mirror into that directory.
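
For example, using rsync against a mirror of your choice (the mirror hostname below is just a placeholder, pick a real Slackware mirror close to you):

# mkdir -p /srv/httpd/htdocs/slackware
# rsync -av --delete rsync://mirror.example.net/slackware/slackware-current/ \
    /srv/httpd/htdocs/slackware/slackware-current/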

Test your config

The command “apachectl configtest” will show you any issues with your httpd.conf that need to be resolved before you can start the web server.

Start the web server

Make the rc script executable so that Apache httpd will start on every boot, and then start it manually this time:

# chmod +x /etc/rc.d/rc.httpd
# /etc/rc.d/rc.httpd start

Finally, go have a look at your new Slackware mirror site at http://darkstar.home.arpa/slackware/slackware-current/ !

Let me know if anything is not clear, or if the file content above was not copied correctly.
Have fun! Eric

Chromium 90 packages – again 32bit related issues

There are new ‘chromium‘ and ‘chromium-widevine-plugin‘ packages in my repository, and chromium-ungoogled packages will follow soon.

Chromium was upgraded to 90.0.4430.72, but unfortunately the 32bit package which I built (twice) crashes immediately on startup. This happens both on Slackware 14.2 and on -current. The error message suggests it is something different from the glibc-2.3x related seccomp crash behavior in the previous Chromium 89.x releases.
Since I don’t have hardware that runs a 32bit Slackware OS and could only test on QEMU virtual machines so far, I cannot confirm with 100% certainty that the new Chromium will or will not work on your 32bit Slackware OS, which is why I also kept the older 32bit chromium-89.0.4389.114 package in the repository.
Let me know about your experiences down here in the comments section! I am getting tired of begging Google developers not to break 32bit binaries with every major release. I’d be grateful if people running 32bit Slackware who are affected by the 32bit Chromium crashes would chime in; otherwise I will eventually have to drop 32bit support.

There’s also an updated Widevine plugin package; unfortunately Google only released this newer version for 64bit systems. The new 64bit package has version “4.10.2209.1”, whereas the 32bit package remains at version “4.10.1582.2”.
Note that this Widevine plugin is meant for chromium-ungoogled only. The ‘real’ Chromium does not need or use it, since Chromium downloads this CDM library automatically for you.

Have fun!
Eric

Chromium security updates (and fix for 32-bit crash)

I have updated the ‘chromium‘, ‘chromium-ungoogled‘ and ‘chromium-widevine-plugin‘ packages in my repository.

For Chromium (-ungoogled) these are security updates. The new 89.0.4389.90 release addresses several critical vulnerabilities (it’s the third release in the 89 series in rapid succession actually, to fix critical bugs) but in particular it plugs a zero-day exploit that exists in the wild: CVE-2021-21193. You are urged to update your installation of Chromium (-ungoogled) ASAP.

I made chromium-ungoogled also available for Slackware 14.2; I hope that makes some people happy.

Since I had to build packages anyway, I took the opportunity to apply a patch that fixes the crashes on 32-bit systems with glibc-2.33 installed (i.e. on Slackware-current).
In that same chromium-distro-packagers group that is home to the discussion about Google’s decision to cripple 3rd-party Chromium browsers, I had asked the Chromium team to address the crash Slackware users are experiencing. Google no longer offers 32-bit binaries, which means issues like these are not likely to be caught in their own tests, but luckily they are listening to the packagers who do build 32-bit binaries. The fix took a while to actually get implemented, but in the end it all worked out. I assume that the patch will end up in the Chromium source code after it passes the internal review process.

The Widevine plugin package for which I provided an update is meant for chromium-ungoogled only. The ‘real’ Chromium does not need or use it, since Chromium downloads this CDM library automatically for you. The change to the package is small: it adds a compatibility symlink. That is not needed for chromium-ungoogled itself, but I was alerted to the fact that Spotify specifically looks for ‘libwidevinecdm.so’ in the toplevel Chromium library directory. The update takes care of that.

Also, this was the last Chromium package which I compiled that contains my Google API key as well as the OAuth client/secret credentials. I noticed that Chromium still works as before, even now that the 15 March deadline has passed, but future builds of my package will only contain my API key. That will keep Safe Browsing functional, but it removes Chrome Sync and other features. If you still want Chrome Sync to work with Chromium, I point you to “/etc/chromium/01-apikeys.conf” in my future packages; get inspired by its content.
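
For the curious, such a file essentially just exports the credentials as environment variables which Chromium reads at startup – something along these lines (with placeholder values of course):

export GOOGLE_API_KEY="your_own_api_key"
export GOOGLE_DEFAULT_CLIENT_ID="your_own_client_id"
export GOOGLE_DEFAULT_CLIENT_SECRET="your_own_client_secret"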

Have fun!
Eric

Sync and share your (Chromium and more) browser data among all your computers

I had promised to write a follow-up article about my journey to find a good alternative for the “Chrome Sync” feature that used to be available in my Chromium packages for Slackware. This is that article. It is fairly large and I hope it is written well enough that you can follow the explanations and examples and are able to implement this yourself. The article is divided into a couple of main sections:

  • Securely syncing your browser’s online passwords to (and retrieving them from) a central location under your control
  • Syncing your browser’s bookmarks to a central location under your control
  • Migrating your Chromium passwords into a central password database under your control
  • Migrating your Chromium bookmarks into a bookmark server under your control.

If you run into anything that is not clear, or does not work for you, please let me know in the comments section!

I’d like to remind you that after 15 March 2021 the Chrome Sync service provided by Google’s cloud infrastructure will only be accessible by Google’s own (partly closed-source) Chrome browser. All other Chromium-based software products are blocked from what Google calls “their private API services”.
All non-Google-controlled 3rd parties providing software binaries based on the Chromium source code are affected, not just my Slackware package. The Chromium browser packages that have been provided in most Linux distros for many years will lose a prime selling point (being able to sync your browser history, bookmarks and online passwords to the cloud securely) while Google will still be able to follow your browsing activities and monetize that data.
For many people (including myself) using Chromium, the ability to sync your private browser data securely to Google’s cloud and thus get a unified browser experience on all platforms (Linux, Android, Windows, Mac operating systems) certainly outweighed the potential monetary gain for Google from our use of a product with its origins in Google – like Chromium. Compiling the sources on Slackware allowed me to deal with some of the privacy concerns. But now the balance has been tipped.

Google took Chrome Sync away from us Linux Chromium users, and it’s important to come up with alternatives.
If you don’t use Chrome or Chromium, then this article is probably not relevant: I will assume you are then using Mozilla’s Firefox, and that browser has its own open source cloud sync server. You either use Mozilla’s own cloud sync or set up a Sync Server on a machine which is fully under your control. The same is unfortunately not possible for Chromium browsers, although Brave and Vivaldi – two not completely open source Chromium derivatives – have developed their own cloud sync server implementations. But (again unfortunately) those sync solutions are fully entangled with their own browser products.

I already wrote about “un-Googled Chromium”, the browser which is stripped of everything that would attempt to connect to a Google service. The browser that is provided by my “chromium-ungoogled” package is as good as the original “chromium” browser in terms of core functionality and perhaps it is even a tad faster than the original. I urge you to read that article before continuing below.
Because the remainder of this article will deal with: how do you sync, or save, your browser’s online passwords, your bookmarks and your history to a remote server? Using only open source tools? And in such a way that no one will be able to access these data unless you approve it?

Let’s start with a disappointment: there is currently no Open Source solution available that lets you sync your browser history to a cloud server and then access that history in other browsers on other computers. But, one of the products further below (xBrowserSync) does have this feature on its development roadmap – we will just have to be a wee bit patient.
With that out of the way, let’s focus on what is possible.

Syncing your browser’s online passwords

The internet is not a safe place. Sometimes, web sites are hacked and user credentials obtained. If you deal with web sites where you have to authenticate yourself to gain access to certain functionality, that server needs to store your credentials (hopefully encrypted) locally. If you want to prevent hackers from using your credentials to do you harm, it is highly recommended to always use strong passwords, and to use a different password for every web site that you need to log in to.
Remembering all these strong passwords is an impossible task for the average human, which is why we have password managers. A password manager is an application that provides a means to store your credential data in an encrypted database which you unlock with just a single passphrase – and that leaves you with the more easily achievable task of remembering one strong password to gain access to all the others. Password managers have become quite user-friendly and some of them integrate with your web browser.

I present one of those password managers here, KeePassXC. I think it is the best solution to our problem. It is cross-platform (Linux, Windows, MacOS), uses strong encryption, is open source and integrates with Google and Mozilla browsers. Including un-Googled Chromium.
KeePassXC is not a client-server based service though – it all happens on your local computer. In order to make your password database accessible on multiple computers, you will have to store it in such a way that it automatically gets uploaded to (and then kept in sync with) a networked location. If you want to be able to access your passwords while on the move (i.e. outside of your house) the most logical solution is to use a “cloud storage” provider who offers cross-platform sync clients – at a minimum we want Slackware Linux to be supported.

The steps to take:

  1. Ensure that you have a cloud storage client installed on your computers. I’ll give two examples: a client application for the free basic plan of the commercial Dropbox, or for a NextCloud server you operate yourself. There are other solutions of course, use whatever you like most.
    Key here is that you are going to create an encrypted password database somewhere inside the directory tree that you sync to this cloud storage provider.
  2. Download and install chromium-ungoogled. Start it and go through the initial configuration as explained in a previous article so that browser extensions can be installed properly and easily, and ensure that you install the KeepassXC extension from the Web Store. Don’t configure that extension yet.
  3. On your Slackware-current computer, download and install the keepassxc package. Like my chromium-ungoogled package, I have this only available for Slackware-current but in future I intend to add Slackware 14.2 support also for chromium-ungoogled and the sync software.
  4. Start KeePassXC – it is a graphical program so you need to be logged in to a graphical session like Plasma5 or XFCE – and configure it to automatically launch at system startup i.e. when you login (in General > Basic Settings). Configure all other convenience settings to your liking, such as minimizing to system tray, automatically locking the database etc.
  5. Enable browser integration in KeePassXC (Tools > Settings > Browser integration > Enable). In the “General” tab, check the box to at least enable “Chromium” integration. Check any other browser that you would like to use with KeepassXC but for the purposes of this article “Chromium” is the only mandatory choice.
  6. [Only if you use my chromium-ungoogled package instead of regular chromium]: Open the “Advanced” tab and check the box “Use a custom browser configuration location”. Select “Chromium” in the “Browser Type” dropdown. In the “Config Location” entry field you write “~/.config/chromium-ungoogled/NativeMessagingHosts”. This will generate a file “org.keepassxc.keepassxc_browser.json” in that location, specifying the integration parameters between the KeePassXC program and the browser extension.
    You may have to fiddle a bit with that last step, to get KeePassXC to actually create the JSON file. It may be that you have to start up your chromium-ungoogled first and then enable browser integration in KeePassXC (see the quick check right below this list).
  7. In KeepassXC you now create a new database (or point it to a database you may already have) in your cloud-sync directory. That database will then sync remotely and will be accessible by KeepassXC instances running on other computers that have the same cloud sync client installed.
    Word of advice: If you do not have a password database already and are starting from scratch, it will be easier to import your old browser’s passwords right now instead of having to merge them later. Look at the end of this article for the section “Importing your passwords from Chromium/Chrome into KeePassXC” which will tell you exactly how to import your old passwords before you continue to the next step!
  8. Now that we have successfully connected KeePassXC to its browser extension, your browser will ask you if it should store any new credentials you enter on web sites into the encrypted KeePassXC database. The KeePassXC extension will automatically add icons to credential entry fields on web pages that allow you to auto-fill these fields if the web site is already known in your database. And if you don’t want to invent strong passwords yourself, password fields can be enhanced with an icon that gives access to the KeePassXC strong random password generator.
  9. Goal achieved. You can now share and use your online passwords on multiple computers. Note that this functionality is not limited to your (un-Googled) Chromium; it also works with the Firefox browsers that you use.

Note: For Android, if you installed un-Googled Chromium from F-Droid following these instructions, there are the compatible KeePass2Android and KeePassDX apps, both recommended by the KeePassXC developers, so you can access all your passwords even if there’s no integration with the browser.
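
As a quick check that the browser integration of step 6 actually worked, verify that the JSON file appeared in the location you configured:

$ ls ~/.config/chromium-ungoogled/NativeMessagingHosts/
org.keepassxc.keepassxc_browser.json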

Syncing your browser’s bookmarks

For this part, I had already done some preparations. I shared those in a previous post: how to build and configure a MongoDB database server. The best open source bookmark sync server I could find is xBrowserSync, and it uses MongoDB as its database backend.
xBrowserSync is a service you can operate yourself by installing it on a machine you control, but you can also use any of the free online xBrowserSync servers that are operated by volunteers on the internet. Registration to a server is not required; access is anonymous, and only you know the sync ID and password which are used to store your data.

I’ll first instruct you how to get the browser side working with the developers’ freely accessible servers, and then I’ll show you how to set up your own sync server.

Setting up your browser

It starts with installing the xBrowserSync browser extension into your (un-Googled) Chromium from the Web Store. Important: if you want to use xBrowserSync you need to disable other synchronization mechanisms like Chrome Sync first to prevent interoperability issues. Opening the settings for this extension takes you through a set of introduction messages and then lets you pick a sync server to connect to – the default is the developers’ own server. You then specify a password which will be used to encrypt your data before it gets uploaded. The server will assign you a “Sync ID”, a seemingly random string of 32 hex characters. Write down that Sync ID! It is the combination of the Sync ID and your password that will grant access to your bookmarks if you install the xBrowserSync extension in another browser.
Sync will start immediately.

Setting up your server

Now on to setting up an xBrowserSync server yourself. After all, what’s the fun in using someone else’s services, right? You could have stuck with Chrome otherwise.
To install an xBrowserSync service, you’ll require an online computer which ideally runs 24/7. You can use a desktop or a real server, but it needs to be a 64bit system. The reason? MongoDB, the database backend, only supports 64-bit computers. You won’t be able to compile it on 32-bit.
I run mine from a computer at home, but you can rent a server online too if you would like that better. Assuming here that the computer runs 64-bit Slackware 14.2 or higher, the requirements are:

  • Apache’s webserver “httpd” is installed and running (other web servers like nginx are also possible but outside the scope of this article). Apache httpd is part of a Slackware full installation. In this article we will assume that the URL where you can reach the webserver is https://darkstar.net/ .
  • You have configured your Apache httpd for secure connections using HTTPS and the SSL certificates come from Let’s Encrypt. If this is something you still need to do, read a previous article here on the blog, on how to set that up using the ‘dehydrated‘ script (which was added to Slackware-current since I wrote that post).
  • TCP ports 80 (http) and 443 (https) are open to the Internet (if you want to be able to access your server while on the move). For a server that is running in your own home, you’ll have to port-forward these ports 80 and 443 from your ISP’s modem/router to your own computer.
  • Node.js is installed. You can get a package from my repository.
  • MongoDB is installed and the database server is running. A package is available in my repository and configuration of MongoDB is covered in my previous article.
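
A quick sanity check that the two main prerequisites are in place before we continue (the output will of course depend on the versions you installed):

$ node --version
$ ps -C mongod -o pid,args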

We start by creating a dedicated user account to run the xBrowserSync service on our computer:

# groupadd xsync
# useradd -g xsync -d /home/xsync -m -s /bin/false xsync

Then clone the xBrowserSync API source repository. My suggestion is to use /usr/local for that and give our new “xsync” user the access permissions to maintain the xBrowserSync files. The ‘#’ depicts root’s shell prompt and ‘$’ is a user shell prompt:

# mkdir /usr/local/xbrowsersync
# chown xsync /usr/local/xbrowsersync
# su - xsync -s /bin/bash
$ cd /usr/local/xbrowsersync
$ git clone https://github.com/xbrowsersync/api.git

Still as user ‘xsync’, install and build the xBrowserSync API package:

$ cd api
$ npm install
$ npm audit fix
$ exit

As ‘root’, configure the log location we will use when the service starts:

# mkdir /var/log/xBrowserSync/
# chown xsync:xsync /var/log/xBrowserSync/

We have xBrowserSync installed; now we need to create the conditions for it to work. So we start with configuring a MongoDB database for xBrowserSync. Below I will use example names for the database, the DB user account and its password. You should of course pick names of your own choosing! Just substitute them in the commands and configuration below when talking to the database.
Database: xbrowsersyncdb
DB User: slackadmin
DB Pass: slackpass

Start a mongo shell – it does not matter under which user account you do this as long as you know the MongoDB admin account (mongoadmin in this example) and its password:

$ mongo -u mongoadmin -p
>

Run the following commands in the mongo shell:

use admin
db.createUser({ user: "slackadmin", pwd: "slackpass", roles: [{ role: "readWrite", db: "xbrowsersyncdb" }] })
use xbrowsersyncdb
db.newsynclogs.createIndex({ "expiresAt": 1 }, { expireAfterSeconds: 0 })
db.newsynclogs.createIndex({ "ipAddress": 1 })

With the MongoDB database ready for use, we can proceed with configuring xBrowserSync. Log in again to your computer as user ‘xsync’ and make a copy of the default settings file so that we can insert our own custom settings:

$ cd /usr/local/xbrowsersync/api
$ cp config/settings.default.json config/settings.json

In that file “/usr/local/xbrowsersync/api/config/settings.json”, change the following values from their defaults (or at least ensure that the values below are present). The “db.*” lines are the MongoDB connection parameters:

"status.message": "Welcome to Slackware xBrowserSync"
"db.name": "xbrowsersyncdb"
"db.username": "slackadmin"
"db.password": "slackpass"
"port": "8080"
"location": "your_two_character_country_id"
"log.file.path": "/var/log/xBrowserSync/api.log"
"server.relativePath": "/"

Note that the nodejs server listens at port 8080 by default and I leave it so. But if you are already running some other program using that same port number, you can change the port to a different value, say “8888”, in the above file – look for the “port”: “8080” line.

To operate the new xBrowserSync server, we create a shell script we can start at boot. The commands below show the use of a “here-document” to write a file. If you are unsure how to deal with the commands below, check out what a “here-document” actually means.
Still as user ‘xsync’ you run:

$ mkdir ~/bin
$ cat <<EOT > ~/bin/xbs_start.sh
#!/bin/sh
cd /usr/local/xbrowsersync/api
node dist/api.js
EOT
$ chmod 755 ~/bin/xbs_start.sh

This is a simple script to start a node.js server. That server does not move itself to the background, but we need the command to ‘disappear’ into the background if we want our computer to complete its boot sequence. There is a nice way to deal with backgrounding a program that cannot run as a background service all by itself: we offer it a ‘virtual console’ and then move that virtual console to the background. A VNC server would be the graphical X analog of such a ‘virtual console’.
Slackware comes with two programs that offer what we need: screen and tmux. I am a screen fan so I’ll use that. As user ‘root’, add this to ‘/etc/rc.d/rc.local’:

# Start the xBrowserSync Server:
if [ -x /home/xsync/bin/xbs_start.sh ]; then
  echo "Starting xBrowserSync Server:  ~xsync/bin/xbs_start.sh"
  su - xsync -s /bin/bash -c "/usr/bin/screen -AL -mdS XSYNC bash -c '/home/xsync/bin/xbs_start.sh'"
fi

This ensures that your xBrowserSync service will start at boot, and the service will run in a detached ‘screen‘ session for user ‘xsync’. If you want to access the running program, you can attach to (and detach from) that running ‘screen’ session from your console or terminal prompt. As user ‘xsync’, simply run:

$ screen -x

… to connect to the virtual terminal, to look at the program output or perhaps even to terminate it with “Ctrl-c“. And if you have seen enough, you can easily detach from the ‘screen’ session again using the “Ctrl-a d” key combination, leaving that session to continue running in the background undisturbed.
I will not force you to reboot your computer just to get the service up in the air, so this one time we are going to start it manually. Log in as user ‘xsync’ and then run:

$ screen -AL -mdS XSYNC bash -c '/home/xsync/bin/xbs_start.sh'

You will not see a thing, but xBrowserSync is now running on your computer.
All that’s left to do now is to allow your Chromium or Firefox browser to connect to the xBrowserSync program. The program listens at the loopback address (127.0.0.1 aka localhost) on TCP port 8080, so access is only possible from that same computer. That is a safety measure – no one can access your service from a remote computer by default.
Note that if you changed the default xBrowserSync listen port from 8080 to a different value in “settings.json” above, you need to use that same port number in the Apache configuration block below.
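
Before putting Apache in front of it, you can verify locally that the API answers on its info endpoint (adjust the port number if you changed it in “settings.json”):

$ curl http://127.0.0.1:8080/info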

We will configure Apache httpd to act as a reverse proxy – this means that we will deploy the httpd server as the frontend for xBrowserSync. Browsers will never connect to xBrowserSync directly (it is not even accessible from other computers), but instead we let them connect to Apache httpd. The webserver will relay these browser connections to the sync server running on the same machine.
Apache’s httpd is much better at handling a multitude of connections than the nodejs server used for xBrowserSync and we can also let httpd handle the encrypted connections so that we don’t have to complicate the xBrowserSync setup unnecessarily.

In the context of this article, we will operate our xBrowserSync server at the URL https://darkstar.net/xbrowsersync/ – this will be the URL you enter in the browser extension’s configuration instead of the default service URL https://api.xbrowsersync.org/. The “/xbrowsersync/” part of that URL will be reflected in our Apache configuration.
There is no need to add anything to your VirtualHost configuration for the HTTP (80) web service, since we want to access the service exclusively over HTTPS (encrypted connections). This is the block of text you have to add to your VirtualHost configuration for the HTTPS (443) web service:

ProxyRequests Off
ProxyVia on
ProxyAddHeaders On
ProxyPreserveHost On

<Proxy *>
  Require all granted
</Proxy>

<Location "/xbrowsersync/">
  # Route incoming HTTP(s) traffic at /xbrowsersync/ to port 8080 of XBrowserSync
  ProxyPass        http://127.0.0.1:8080/
  ProxyPassReverse http://127.0.0.1:8080/
</Location>
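
One caveat: these Proxy directives require Apache’s proxy modules. If “apachectl configtest” complains about unknown directives, make sure the relevant LoadModule lines are present (and not commented out) in your httpd.conf – on 64bit Slackware the module path matches the mod_include line earlier in this article:

LoadModule proxy_module lib64/httpd/modules/mod_proxy.so
LoadModule proxy_http_module lib64/httpd/modules/mod_proxy_http.so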

Check your Apache configuration before starting the server:

# apachectl configtest
# /etc/rc.d/rc.httpd restart

Check the server status at https://darkstar.net/xbrowsersync/info
A nice GUI view of your service, and also the URL that you need to enter in the browser add-on: https://darkstar.net/xbrowsersync/

Importing your passwords from Chromium/Chrome into KeePassXC

You don’t have to start from scratch with storing web site credentials if you were already saving them in Chromium – whether you were using Chrome Sync or not.
This is how you export all your stored credential data from your Chromium browser and import that into KeePassXC:

  1. In Chromium, open “chrome://settings/passwords” and next to “Saved Passwords” click the triple-dot menu and select “Export Passwords”.
    By default, a comma-separated values (CSV) file will be generated as “~/Documents/Chrome Passwords.csv”. That file contains all your passwords un-encrypted, so be sure to delete that file immediately after the next steps!
  2. If you also save Android app passwords into Google’s password vault and your Chromium browser was using Chrome Sync, you’ll have to clean the above CSV file by removing all lines that start with the string “,android:”. For instance using a command like this:

    $ sed -e "/^,android:/d" -i ~/Documents/Chrome\ Passwords.csv

  3. Open the KeePassXC program on your computer. We are going to import the CSV file containing the credentials we just exported from Chromium via the menu “Database > Import > CSV File”. The first thing to do is to pick your new database’s format and encryption parameters. The default values are perfectly OK. Then define the password that is going to protect your database, and the location to store that database.
    • If this is actually going to be the database you intend to use with your browser, then select a directory that is being synchronized with an external cloud storage location.
    • If you already have a password database you are using in your browser but you want to import your old Chromium passwords after the fact, then the location of this database does not matter since it is only a temporary one and can be deleted after merging its contents with the real password database. So in this case, anywhere inside your homedirectory is just fine.
  4. An import wizard will then guide you through the process of associating the CSV data to matching database fields.
    This is what you should aim for, to match the columns in the CSV file (name, user, password, url) to the corresponding fields in the KeePassXC database (Title, Username, Password, URL):
    The trick here is to check the box “First line has field names” and then select: “name” in the Title dropdown; “username” in the Username dropdown; “password” in the Password dropdown and “url” in the URL dropdown. The dropdown for Group will show “name” as well but you need to select “Not Present” there and also ensure that all the other dropdowns are showing “Not Present”.
  5. After importing has completed and you have validated that the CSV values (name, user, password and url) are indeed imported into the correct database fields, the final question: is this your passwords database to use with the browser plugin?
    • If yes, then you are done!
    • If no, then we need to merge the database we just created into your real passwords database.
      Open your real passwords database in KeePassXC and then select the menu “Database > Merge from Database…”. You get the chance to select your temporary database you just created from the imported CSV data. Enter that database’s password to unlock it and its data will immediately be merged. You can now safely delete the temporary database where you imported that CSV data.
  6. You must now delete the CSV file containing all your clear-text passwords. It is no longer needed.
  7. That’s all. Good luck with using your passwords across devices and browsers safely and under your exclusive control.

Importing your browser bookmarks into xBrowserSync

As with your passwords above, there’s also an easy way to import the bookmarks you were using in Chromium into xBrowserSync, so that you can make these available in un-Googled Chromium. Remember that Chromium’s use of Chrome Sync conflicts with the xBrowserSync extension. You can of course first disable that Chrome Sync (and after 15 March 2021 you should do that anyway) and then enable xBrowserSync, but the procedure below will likely also work when you want to export bookmarks from another browser not supported by xBrowserSync.

  1. In Chromium, open the bookmark manager: “chrome://bookmarks/
  2. Click on the triple-dot menu in the top-right and select “Export bookmarks”. Save the resulting HTML file somewhere, it is temporary and can be deleted after importing its contents into un-Googled Chromium.
  3. Open the resulting HTML file in an editor. The “Kate” editor in KDE Plasma can deal pretty well with the HTML and breaks it down for you into sections you can easily collapse, expand or remove. You need to remove the “Apps” bookmark section in the HTML, which is not useful for the un-Googled version, and perhaps you want to remove some other stuff as well.
  4. In un-Googled Chromium, open “chrome://bookmarks/” and this time select “Import bookmarks” from the triple-dot menu in the top-right corner. Pick the HTML file you exported from Chromium and the imported content will show up in your Bookmarks Bar as a bookmark-menu called “Imported”.
  5. You’ll probably want to move stuff out of that “Imported” menu and into the locations where you previously had them in your old Chromium’s Bookmarks Bar. You can do that most easily from within the bookmark manager by right-clicking stuff, selecting “Cut” and then pasting them into another location on the Bookmarks Bar.
    One caveat: I found out empirically that the xBrowserSync extension does not like it when it is still syncing a change in your bookmarks, and then you make another change before it is finished. It will simply invalidate the sync and try to revert to the previously known bookmark list. The thing is, if you did a “Cut” in the meantime with the intention to “Paste” a set of bookmarks somewhere else, that set will be gone forever. Best is to keep an eye on the xBrowserSync extension’s icon next to the URL entry field and do not try to make changes while it is still in the process of syncing. If you keep this in mind, then the experience will be just great.
  6. All done! Enjoy the fact that the bookmarks you create or modify in one browser, will be immediately accessible in other connected browsers.

This concludes a lengthy article; I hope it contains information you can make good use of. In the days and weeks to come, we’ll see what happens with the original Chromium and whether un-Googled Chromium can take its place. Let me know your experiences.

Have fun! Eric
