
How to install DWSIM 8.0.4 on Debian 10

Created by Steven Baltakatei Sandoval on 2022-07-27T21:23Z under a CC BY-SA 4.0 license and last updated on 2022-07-27T22:44Z.



These are instructions for installing DWSIM 8.0.4 onto a Debian 10 machine with an amd64 CPU (e.g. Intel i9 or AMD Ryzen).


DWSIM is an open source chemical process simulator released under the GPLv3 license. Although its full-featured version is available only for Windows (most CAPE-OPEN modules are intended to run in Windows), the main developer, Daniel Medeiros, compiles and publishes a version of DWSIM that runs on Debian 10. The following instructions may apply to other Debian-derived distributions such as Ubuntu, but some customization may be required (e.g. Ubuntu and Debian use different Mono Stable repositories).


Download DWSIM Debian Installer Package

Download the DWSIM Debian Installer Package (.deb) file from the website. You should get a file with a name resembling dwsim_8.0.4-amd64.deb.

  1. Don't do this

    Note: if you try to install the .deb file via the usual $ sudo dpkg -i package.deb approach, you will get the following errors about missing dependencies (read on to solve them):

    $ sudo dpkg -i dwsim_8.0.4-amd64.deb
    [sudo] password for baltakatei: 
    Selecting previously unselected package dwsim.
    (Reading database ... 237740 files and directories currently installed.)
    Preparing to unpack dwsim_8.0.4-amd64.deb ...
    Unpacking dwsim (8.0.4) ...
    dpkg: dependency problems prevent configuration of dwsim:
     dwsim depends on mono-complete (>= 6.8); however:
      Package mono-complete is not installed.
     dwsim depends on mono-vbnc (>= 4.0); however:
      Package mono-vbnc is not installed.
     dwsim depends on gtk-sharp2 (>= 2.12); however:
      Package gtk-sharp2 is not installed.
     dwsim depends on libfontconfig1-dev; however:
      Package libfontconfig1-dev is not installed.
     dwsim depends on coinor-libipopt1v5; however:
      Package coinor-libipopt1v5 is not installed.
    dpkg: error processing package dwsim (--install):
     dependency problems - leaving unconfigured
    Errors were encountered while processing:

Install DWSIM

With the .deb file downloaded, you can use a script (link) I wrote to perform the remaining steps.

However, I will explain the actions it performs in case you wish to do them manually.

  1. Enable Mono Stable's Debian 10 repository

    The DWSIM .deb file requires the mono-complete package (version >=6.8) that is available in a non-standard repository hosted by the Mono Project. To configure Debian 10 to download package lists and updates from this repository, run the following commands (taken from the Mono Project's download page):

    sudo apt install apt-transport-https dirmngr gnupg ca-certificates
    sudo apt-key adv --keyserver hkp:// --recv-keys 3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF
    echo "deb stable-buster main" | sudo tee /etc/apt/sources.list.d/mono-official-stable.list
    sudo apt update

    These commands tell the operating system to trust software produced by the Mono Project and where to get future updates via apt.

  2. Install DWSIM dependencies

    Now that the operating system knows about the Mono repository, install software from Mono:

    $ sudo apt install mono-complete
    $ sudo apt install mono-vbnc gtk-sharp2 libfontconfig1-dev coinor-libipopt1v5
  3. Install DWSIM

    Now that the Mono dependencies are installed, dpkg can be used to install the .deb file manually:

    $ sudo dpkg -i dwsim_8.0.4-amd64.deb

    Note: if you run this command without the required dependencies being available, you may need to explore using $ sudo apt -f install and related commands to fix things.


To run DWSIM, execute dwsim from the command line:

$ dwsim

In order to test that DWSIM is working, here is a DWSIM 8.0.4 project file that you can download and open. It models a simple gas compressor. A screenshot of the process flow diagram can be downloaded here.

Posted 2022-07-27T22:18:35+0000

GNOME Extension: Panel Date Format

Created by Steven Baltakatei Sandoval on 2022-04-28T21:03Z under a CC BY-SA 4.0 license and last updated on 2022-04-28T22:08Z.


I enjoy keeping track of the time using the ISO-8601 format. It permits unambiguous, big-endian time indication in a sortable format. I believe the default time format used in the top bar in Pop!OS 22 (which is basically Ubuntu 22) depends upon which region the machine is configured for (e.g. "United States", "Germany", etc.).

My default date time format is something like Apr 28 2022, 21:44 (I'm not sure because I don't care ^_^). What I really want is an ISO-8601 date time format like 2022-04-28T21:45:33+0000. The +0000 explicitly states the time zone my machine is configured for (UTC in this case); this is useful in case I want to communicate the time to someone unambiguously; knowing the time zone is required to do so. Also, time units are sorted in descending fashion (year, month, day, hour, minute, second) instead of the tradition-bound madness that is US date time format (month, day, year, hour, minute, second).
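The sortability claim is easy to check: big-endian ISO-8601 strings with the same UTC offset sort chronologically under plain text sorting. A quick Python check (with made-up timestamps):

```python
from datetime import datetime

# Hypothetical timestamps, all with the same UTC offset.
stamps = [
    "2022-04-28T21:45:33+0000",
    "2021-12-31T23:59:59+0000",
    "2022-01-01T00:00:00+0000",
]

# Plain lexicographic order matches chronological order.
by_text = sorted(stamps)
by_time = sorted(stamps, key=lambda s: datetime.strptime(s, "%Y-%m-%dT%H:%M:%S%z"))
print(by_text == by_time)  # True
```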

One GNOME Extension that I found works for me in Pop!OS 22 is "Panel Date Format" (GitHub).


  • Install the GNOME Shell integration add-on for Firefox.

  • Go to the Panel Date Format extension page.

    • Enable using the toggle switch on the top right of the page.
  • Go to the GitHub repository for the extension.

    • Copy the dconf command into a text editor.
  • Modify the dconf command's date format string from "'%Y-%m-%d'" to "'%Y-%m-%dT%H:%M:%S%z'"

    • The command I used is:

      $ dconf write /org/gnome/shell/extensions/panel-date-format/format "'%Y-%m-%dT%H:%M:%S%z'"

  • Verify the change is loaded by running $ dconf dump / > dconf_dump.txt and searching this .txt file for a part labelled [org/gnome/shell/extensions/panel-date-format]. (you can also use the dump and load commands to automatically backup settings that use the dconf database; I use yadm's bootstrap function to do this automatically when automatically setting up a new Debian machine).
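The format string passed to dconf uses strftime-style codes, so you can preview what the panel will display with a short Python sketch:

```python
from datetime import datetime, timezone

# Same strftime-style format string passed to dconf above.
fmt = "%Y-%m-%dT%H:%M:%S%z"

# Preview using the current UTC time.
print(datetime.now(timezone.utc).strftime(fmt))
```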

The date time indicator on the top bar should now look like this:



Using GNOME Extensions and a command in the command line, it is possible to customize the top bar date time indicator to use ISO-8601 formatting in Pop!OS 22.

Posted 2022-04-28T22:09:08+0000

Using Open Timestamps with Git

Created by Steven Baltakatei Sandoval on 2022-04-22T15:00Z under a CC BY-SA 4.0 license and last updated on 2022-04-23T13:56Z.


I learned how to configure git to automatically create Open Timestamps proofs whenever I sign a commit with my OpenPGP key.


Although Bitcoin is mostly popular for its use as digital money, in order for it to properly function in a decentralized manner, it must also act as a timestamping service to itself. Bitcoin does this by requiring miners to mark each candidate block they produce with a timestamp; accuracy is enforced by some rules that cause a block to be rejected. These rules can be roughly summarized as:

  1. The timestamp must be later than the median timestamp of the previous 11 blocks (Median Time Past rule)
  2. The timestamp must not be more than about two hours in the future (Future Block Time rule)

Therefore, provided that the majority of miners are honest (an assumption Bitcoin already requires), the most recent block will likely have a timestamp that can be expected to be accurate to within a few hours.

OpenTimestamps is a timestamping service and set of programs that relies upon this internal timestamping feature of the Bitcoin blockchain. It was created by Peter Todd, a former Bitcoin Core developer. I believe I first heard about OpenTimestamps (OTS) while following discussions on the /r/Bitcoin subreddit (possibly from this thread).

The timestamp service works by having a "calendar" server program collect file hashes submitted by client programs at the direction of users. The server integrates each submitted hash into a merkle tree. Periodically, the calendar server creates and submits a Bitcoin transaction containing the tree's merkle root. Once the transaction is included in the Bitcoin blockchain, the calendar server can provide a client with the merkle branch leading from the merkle root to the file hash the client submitted. The merkle branch is typically encoded in an .ots file created next to the hashed file; this .ots file permits the client, armed with a copy of the blockchain, to verify that the hashed file existed at least as far back as the timestamp of the block containing the merkle root.
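To make the merkle-branch idea concrete, here is a simplified Python sketch. It is not the actual OTS proof format (which defines its own operations and serialization); it only shows how a short branch of sibling hashes lets a client recompute the root from a single leaf:

```python
import hashlib

def sha256(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def verify_branch(leaf: bytes, branch, root: bytes) -> bool:
    """Walk a merkle branch from a leaf hash up to an expected root.

    branch is a list of (sibling_hash, sibling_is_left) pairs, one
    per tree level, from the leaf upward.
    """
    h = leaf
    for sibling, is_left in branch:
        h = sha256(sibling + h) if is_left else sha256(h + sibling)
    return h == root

# Build a tiny 4-leaf tree to demonstrate.
leaves = [sha256(d) for d in (b"a", b"b", b"c", b"d")]
n01 = sha256(leaves[0] + leaves[1])
n23 = sha256(leaves[2] + leaves[3])
root = sha256(n01 + n23)

# Branch proving leaf "c" is in the tree: only two hashes are needed.
branch = [(leaves[3], False), (n01, True)]
print(verify_branch(leaves[2], branch, root))  # True
```

The point of the structure is scalability: a calendar can aggregate millions of hashes into one Bitcoin transaction, and each client's proof stays logarithmically small.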

Using OpenTimestamps (OTS)

It is possible to use the website to create the .ots file using only a web browser. See the "STAMP & VERIFY" section of the main webpage.

However, client software is also provided in the Python, Javascript, and Java programming languages. Personally, I used the Python implementation which the examples later in this blog post will reference.


On a fresh Debian-based system, the Python implementation of OTS can be installed via:

$ sudo apt install python3-pip
$ pip3 install opentimestamps-client

The directory containing the ots and ots-git-gpg-wrapper (to be used later) executables should then be added to the PATH environment variable. This can be done by adding this code to your $HOME/.profile file (or whichever file you use to automatically load custom environment variables, e.g. $HOME/.bashrc):

# set PATH so it includes user's private bin if it exists
if [ -d "$HOME/.local/bin" ] ; then
    PATH="$HOME/.local/bin:$PATH"
fi
Running source ~/.profile (or whichever sh file contains the above code) will then modify PATH. Environment variables set in your current shell can be found by running $ printenv.


The command line interface for OTS version 0.7.0 works like this:

Suppose you want to timestamp a file. You can do this by running:

$ ots stamp

This will create an .ots file adjacent to the original file.

However, this .ots file does not yet contain the full merkle branch data necessary for verification, meaning the calendar server(s) used by ots (several are configured by default) will have to be contacted later in order to upgrade the .ots file.

An .ots file can be upgraded via:

$ ots upgrade

If you are okay with the command waiting the (typically) several hours required for a calendar server to provide merkle branch data before exiting, then you can run this from the start:

$ ots --wait stamp

An .ots file can be verified by running:

$ ots verify

In any case that results in downloading a proof from a calendar server, ots will save a cache of the merkle branch data in the $HOME/.cache/ots directory. ots looks for proof data here before contacting a calendar server. Also, I suspect that because ots only ever adds files to $HOME/.cache/ots and never modifies files there, multiple machines running the same version of ots can use a file synchronization tool such as syncthing to merge their OTS caches for future reference.

I have used this procedure to timestamp some of my own files for my own amusement and reference.

OTS Git Wrapper

The reason I created this blog post, aside from documenting my own usage of ots, was to document how I started using the ots-git-gpg-wrapper feature.

git (distributed version control software I use in this blog, my Notable Public Keys book, and other projects) can be configured to use ots to improve signed commits. The OTS documentation for this feature can be found within the OTS GitHub repository.

  1. Typical Usage

    To summarize, the feature can be activated for all git repositories by running:

    $ path_wrapper="$HOME/.local/share/ots/"
    $ git config --global gpg.program "$path_wrapper"

    If you want to activate this feature for only a single repository, change the working directory to the repository, replace --global with --local, and run the command, like so:

    $ cd $HOME/some_git_repo
    $ path_wrapper="$HOME/.local/share/ots/"
    $ git config --local gpg.program "$path_wrapper"

    The default contents of the wrapper script are:

    # Wrapper for the ots-git-gpg-wrapper
    # Required because git's gpg.program option doesn't allow you to set command
    # line options; see the doc/
    ots-git-gpg-wrapper --gpg-program "`which gpg`" -- "$@"
  2. Advanced Usage

    To go further, I modified the original wrapper script to contain:

    # Wrapper for the ots-git-gpg-wrapper
    # Required because git's gpg.program option doesn't allow you to set command
    # line options; see the doc/
    # Check if gpg is alias. (see )
    if alias gpg 2>/dev/null; then
        ## Get gpg alias command
        gpg_cmd="$(type gpg)"; # get raw alias definition
        gpg_cmd="${gpg_cmd#*=\'}"; # trim chars before and including first apostrophe
        gpg_cmd="${gpg_cmd%\'*}"; # trim chars after and including last apostrophe
    else
        gpg_cmd="$(which gpg)";
    fi
    # Check if jsonrpc_option file available
    if [ -f "$path_jsonrpc_option" ]; then
        jsonrpc_option="$(head -n1 "$path_jsonrpc_option")";
    fi
    eval "ots-git-gpg-wrapper $jsonrpc_option --gpg-program $gpg_cmd -- $@"

    To add the --wait option when running ots-git-gpg-wrapper, I copied the wrapper script to an adjacent file containing:

    # Wrapper for the ots-git-gpg-wrapper
    # Required because git's gpg.program option doesn't allow you to set command
    # line options; see the doc/
    # Check if gpg is alias. (see )
    if alias gpg 2>/dev/null; then
        ## Get gpg alias command
        gpg_cmd="$(type gpg)"; # get raw alias definition
        gpg_cmd="${gpg_cmd#*=\'}"; # trim chars before and including first apostrophe
        gpg_cmd="${gpg_cmd%\'*}"; # trim chars after and including last apostrophe
    else
        gpg_cmd="$(which gpg)";
    fi
    # Check if jsonrpc_option file available
    if [ -f "$path_jsonrpc_option" ]; then
        jsonrpc_option="$(head -n1 "$path_jsonrpc_option")";
    fi
    eval "ots-git-gpg-wrapper --wait $jsonrpc_option --gpg-program $gpg_cmd -- $@"

    Part of these modifications accounts for the fact that I use a shell alias to include custom options whenever I run gpg.

    Additionally, my modifications permit ots commands run from my main GNU/Linux Debian-based workstation (which is NOT a bitcoin node) to connect to a bitcoin node using the --bitcoin-node option of the ots-git-gpg-wrapper command. I do this by loading into a variable the first line of a file named jsonrpc_option.txt that I maintain alongside the wrapper script. The first line of the jsonrpc_option.txt file contains the information needed to connect to a bitcoin node (a Raspiblitz in my case) on the local network via the RPC interface on port 8332, using username raspibolt and password hunter2. The file contents take the form:

    --bitcoin-node http://raspibolt:hunter2@

    With those changes in place and the wrapper script activated via a git config command mentioned earlier, if I create a signed commit and then run git log, I will see:

    commit 626303d06805ee6cd50e231da6988fc0235bd8e8 (HEAD -> master)
    ots: Calendar Pending confirmation in Bitcoin blockchain
    ots: Calendar Pending confirmation in Bitcoin blockchain
    ots: Calendar Pending confirmation in Bitcoin blockchain
    ots: Calendar Pending confirmation in Bitcoin blockchain
    ots: Could not verify timestamp!
    gpg: Signature made Fri 22 Apr 2022 05:58:35 PM GMT
    gpg:                using RSA key 38F96437C83AC88E28B7A95257DA57D9517E6F86
    gpg: Good signature from "Steven Sandoval <>" [ultimate]
    gpg:                 aka "Steven Sandoval <>" [ultimate]
    gpg:                 aka "[jpeg image of size 1846]" [ultimate]
    Primary key fingerprint: 3457 A265 922A 1F38 39DB  0264 A0A2 95AB DC34 69C9
         Subkey fingerprint: 38F9 6437 C83A C88E 28B7  A952 57DA 57D9 517E 6F86
    Author: Steven Baltakatei Sandoval <>
    Date:   2022-04-22T17:57:30+00:00
        draft(posts:20220422):using ots with git
        - note: testing timestamp feature with example commit

    After three hours, this message changed to:

    commit 626303d06805ee6cd50e231da6988fc0235bd8e8 (HEAD -> master)
    ots: Calendar Pending confirmation in Bitcoin blockchain
    ots: Calendar Pending confirmation in Bitcoin blockchain
    ots: Calendar Pending confirmation in Bitcoin blockchain
    ots: Got 1 attestation(s) from
    ots: Success! Bitcoin block 733047 attests existence as of 2022-04-22 GMT
    ots: Good timestamp
    gpg: Signature made Fri 22 Apr 2022 05:58:35 PM GMT
    gpg:                using RSA key 38F96437C83AC88E28B7A95257DA57D9517E6F86
    gpg: Good signature from "Steven Sandoval <>" [ultimate]
    gpg:                 aka "Steven Sandoval <>" [ultimate]
    gpg:                 aka "[jpeg image of size 1846]" [ultimate]
    Primary key fingerprint: 3457 A265 922A 1F38 39DB  0264 A0A2 95AB DC34 69C9
         Subkey fingerprint: 38F9 6437 C83A C88E 28B7  A952 57DA 57D9 517E 6F86
    Author: Steven Baltakatei Sandoval <>
    Date:   2022-04-22T17:57:30+00:00
        draft(posts:20220422):using ots with git
        - note: testing timestamp feature with example commit

    The line that matters is the one mentioning block 733047 (probably this transaction, based on the current state of the "bob" calendar server).

    ots: Success! Bitcoin block 733047 attests existence as of 2022-04-22 GMT

    If I run git log another time, the ots: lines change to indicate the local cache is being used to verify instead of a calendar server.

    commit 626303d06805ee6cd50e231da6988fc0235bd8e8 (HEAD -> master)
    ots: Got 1 attestation(s) from cache
    ots: Success! Bitcoin block 733047 attests existence as of 2022-04-22 GMT
    ots: Good timestamp
    gpg: Signature made Fri 22 Apr 2022 05:58:35 PM GMT
    gpg:                using RSA key 38F96437C83AC88E28B7A95257DA57D9517E6F86
    gpg: Good signature from "Steven Sandoval <>" [ultimate]
    gpg:                 aka "Steven Sandoval <>" [ultimate]
    gpg:                 aka "[jpeg image of size 1846]" [ultimate]
    Primary key fingerprint: 3457 A265 922A 1F38 39DB  0264 A0A2 95AB DC34 69C9
         Subkey fingerprint: 38F9 6437 C83A C88E 28B7  A952 57DA 57D9 517E 6F86
    Author: Steven Baltakatei Sandoval <>
    Date:   2022-04-22T17:57:30+00:00
        draft(posts:20220422):using ots with git
        - note: testing timestamp feature with example commit

    If I had instead activated the --wait variant of the wrapper, then the git commit operation would wait until it received a proof from a calendar server before exiting. This may be useful for major signed commits and tags where it is desirable to transmit the proof via the git repository itself instead of relying upon a calendar server or the cache files being available in the future. I don't know of a method to upgrade OTS proof data already included in a git repository.

Use Cases

.ots files

Generating .ots files is easier than messing with the OTS git wrapper script since no knowledge of git is required. For this reason, I imagine timestamping files via the Python client or the website would be more common.

Some use cases I can imagine are:

  • A reporter timestamping .eml files containing controversial correspondence.
  • A lawyer timestamping controversial secret .pdf documents that will be revealed at a later time.
  • A civil engineer timestamping a receipt of a .pdf report they create for a building owner about the need to perform expensive repairs.
  • A business owner timestamping a contract encoded in a .pdf to help prove when an agreement was made.
  • A speedrunner wanting to prove that a world record they recorded in a .mkv video file was made by a certain date in the past.
  • A software developer wanting to prove that certain archived versions of their software existed as early as a certain date.

OTS git wrapper

In the abstract, the OTS git wrapper allows me to prove the existence of files in a git repository that I already sign with OpenPGP. This guards against an attacker who captures my OpenPGP private key and signs files with forged timestamps in order to convince someone of a revised history of some event.

Although I personally am not in the business of proving the existence of files for others, I can imagine software developers who use git wanting to prove they created a piece of code before a copycat did.

Also, because git is version control software for file trees, it can be used to prove the existence of many files at a time (albeit via SHA-1, which, as of 2022, is the best method git supports for hashing data). This may be useful if the number of proofs I need to create is large enough to saturate a significant fraction of public calendar server bandwidth and storage capacities.
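For context on what git actually hashes: a blob's object id is SHA-1 over a short header plus the file's bytes, and commits in turn hash trees of these ids, which is why stamping one commit hash covers every file in the tree. A minimal Python sketch reproducing git's blob hash:

```python
import hashlib

def git_blob_sha1(content: bytes) -> str:
    # git object id: SHA-1 over "<type> <size>\0<content>"
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# Matches `printf 'hello\n' | git hash-object --stdin`
print(git_blob_sha1(b"hello\n"))  # ce013625030ba8dba906f756967f9e9ca394464a
```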


An OTS timestamp can only prove that data existed no later than a certain date. It cannot prove exactly when the data was created.

Also, an attacker could pregenerate a large fraction of all possible permutations of data and create an OTS proof for each permutation. For example, if I wanted to bamboozle people into thinking I could predict the weather anywhere in the world 8 years in advance, I could create a large number of text files containing permutations of the general prediction "I predict in 2022 that on [date in 2030], in [location], it will be [weather type]." I would then create .ots proof files for each prediction and sit on them for 8 years. Then, on the specified date in 2030, I would selectively reveal only the predictions that happened to be true.
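The scale of such an attack is just a counting exercise. With hypothetical numbers (365 dates, 1000 locations, 10 weather types; these figures are my own assumptions, not from the post):

```python
# Hypothetical attack sizing: every (date, location, weather) combination
# gets its own prediction file and .ots proof.
dates = 365          # one per day in 2030
locations = 1000     # assumed list of cities
weather_types = 10   # assumed coarse weather categories

predictions = dates * locations * weather_types
print(predictions)  # 3650000
```

Millions of tiny proofs are cheap to generate, which is why selective revelation is the weak point of naive timestamp interpretation.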


OpenTimestamps can be used to prove the existence of files and git commits in a scalable manner using the Bitcoin blockchain.

Posted 2022-04-25T00:12:36+0000

Notable Public Keys book update (F-Droid, Qubes OS)

Created by Steven Baltakatei Sandoval on 2022-04-11T14:23Z under a CC BY-SA 4.0 license and last updated on 2022-04-12T01:18Z.

I updated the Pubkeys book (PDF, Gitlab).

New Sections


F-Droid

I added the Android app FOSS repository that I use. The developers publish OpenPGP signatures for the downloadable APK of the F-Droid client, a package manager similar to the Google Play Store app. I had started creating a chapter months ago but hadn't fleshed it out yet due to slow loading times of the F-Droid main website.

From what I can tell, F-Droid didn't publish an OpenPGP signature for their main client's APK file until late 2017. Also, there's no obvious place to download the public key (41E7 044E 1DBA 2E89).

Qubes OS

I saw a Reddit post from someone looking to install Qubes OS who wanted a way to use PGP software to verify a Qubes OS installation file. Although I've not yet started a GnuPG chapter (I imagine it will contain a lot of history), I decided I could draft a Qubes OS chapter. Such a chapter would contain the fingerprints necessary to reassure someone looking to download and install Qubes OS for themselves.

I was surprised by how detailed the PGP verification instructions available on the main website were. They have a master key (DDFA 1A3E 3687 9494) that signs each release key; there is one release key per major version (e.g. v1.0, v2.0, etc.).

Future sections

Here's the short list of upcoming chapters I plan on writing next when the fancy strikes me.

  • GnuPG. Software often used to perform OpenPGP operations.
  • LibreOffice. An alternative to Microsoft Office.
  • Red Hat. A popular commercial GNU/Linux operating system / service company I don't have much experience with but which uses OpenPGP signatures to sign software.
Posted 2022-04-11T15:22:22+0000

TeXmacs: Thermodynamics and Chemistry by Howard DeVoe - done

Created by Steven Baltakatei Sandoval on 2022-03-17T02:21Z under a CC BY-SA 4.0 license and last updated on 2022-03-20T18:18Z.


Thermodynamics and Chemistry by Howard DeVoe, Version 10, Transcribed to TeXmacs. It's done. See the repository for the source code. Compare to the original compiled from LaTeX here.

This past week I've made an effort to finish off the Bibliography and citations of the thermodynamics textbook transcription job I assigned myself over a year ago.

Bibliography wrangling

I had to figure out how exactly TeXmacs processes bibtex data. It's complicated and usage isn't well-documented, but I found a way that works. When inserting an automatically-updating bibliography section via "Insert -> Automatic -> Bibliography", it is possible to specify a file as well as a bibliography style if you examine the <bibliography> (?) tag that is created. The file must be a bibtex file. Then, a tag like <cite|author-1999> will trigger the bibliography section to read the file, populate itself with a new bibliography entry, and assign the <cite> tag a reference number.

One of DeVoe's bibtex entries kept causing crashes in TeXmacs whenever I tried to update the bibliography. It was the only one of its kind.

   @InCollection{clapeyron-1834,
   Author = {Clapeyron, \'{E}.},
   Title = {Memoir on the Motive Power of Heat},
   BookTitle = {Reflections on the Motive Power of Fire: Sadi Carnot and other Papers on the Second Law of Thermodynamics by \'{E}.\ Clapeyron and R.\ Clausius},
   Editor = {Mendoza, E.},
   Publisher = {Dover},
   Address = {Mineola, New York},
   Pages = {71-105},
   Note = {Translation by E.\ Mendoza of ``M\'{e}moire sur la Puissance Motrice de la Chaleur,'' \emph{J.\ de l'\'{E}cole Polytechnique}, \textbf{14}, 153--190 (1834).},
   Year = {1988} }

The problem is with the Note value. There's a lot of what looks like LaTeX code (which I'm not familiar with). I'm using the tm-plain style, which seems to be okay with processing some LaTeX'isms in these bibliographical entries (a lot of the markup seems to be used to effect diacritical marks, which could be handled with UTF-8 characters, but I wasn't about to go down the rabbit hole of figuring out how TeXmacs handles Unicode points in its bibliography processing code). I switched out the Note value for something simpler (referencing the title's OCLC code), and now the bibliography compiles (clapeyron-1834 is used in a citation elsewhere in the book).

Note = {See OCLC 559435201; 153--190.},

Two columns in one column document

I also figured out that it's possible to get two-column pages in an otherwise single-column TeXmacs document. Basically, you create a two-column page (single page only) in one TeXmacs document (.tm), then <include> it in another document (via "Insert -> Link -> Include"). I'm pretty sure this stretches what the original developer intended, but it works well enough for the single-page biographical sketches that DeVoe included in his book.

Disappearing Dotted Line

A few months into the transcription project, I emailed Howard DeVoe a PDF of a draft. He noted that some figures had dashed lines that changed depending on how far you were zoomed in. Eventually I concluded the cause was some error or wrong assumption made by Inkscape, ghostscript, or some converter in between the source Encapsulated PostScript (EPS or .eps) file that DeVoe had emailed me and the .eps file that I got to work when inserted into the TeXmacs documents. Instead of trying to hunt down the cause, I decided to fix each issue in each .svg file that Inkscape could produce from each .eps file.

SVG is an XML-like format that I can easily parse with grep, sed, and other command line tools that I'm familiar with; in contrast, EPS is compiled output not meant for direct editing (perhaps decades ago it was meant to be human-editable, but SVG seems to be the standard now). So, I converted every EPS file I received from DeVoe into an SVG master file from which I would export EPS or whatever format I needed to make the book work. I found that each image with the dynamically changing dashed line issue also had a "stroke-dasharray" key-value pair; so, I wrote a script (in ref/notes/baltakatei/bash/ in the repository) to convert each dashed path object into a path object in which each dash is a subpath. This greatly increases the amount of information required to define the dashed line, but the benefit is that the dashed line appears the same no matter the zoom level; in other words, every single dash is converted into a static object instead of letting whatever vector renderer decide how to draw a dashed line.
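The dash-to-subpath conversion can be sketched in a few lines of Python (a simplified, hypothetical illustration of the idea, not the actual script in the repository): given a segment length and a stroke-dasharray, compute the (start, end) interval of each dash so that each can be emitted as its own static subpath.

```python
def dashes_on_segment(length, dasharray):
    """Return (start, end) intervals for each dash along a segment.

    Each interval becomes its own static subpath, so renderers no
    longer re-layout the dashes at different zoom levels.
    """
    if not dasharray or sum(dasharray) <= 0:
        # Degenerate dasharray (e.g. all zeros): treat as a solid line.
        return [(0.0, float(length))]
    spans = []
    pos, i, drawing = 0.0, 0, True
    while pos < length:
        step = min(dasharray[i % len(dasharray)], length - pos)
        if drawing and step > 0:
            spans.append((pos, pos + step))
        pos += step
        i += 1
        drawing = not drawing
    return spans

print(dashes_on_segment(10.0, [3.0, 1.0]))
# [(0.0, 3.0), (4.0, 7.0), (8.0, 10.0)]
```

Note the guard for an all-zero dasharray; zero-length dashes are exactly the kind of degenerate input that caused trouble later in this project.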

While I was patting myself on the back for figuring out how to automatically get Inkscape to process the many SVG files I had, I noticed that some of the images lost some dotted lines. I eventually discovered that some dashed lines had dashes with dash lengths specified so short (zero length), that the Inkscape extension I was using (org.inkscape.filter.dashit.noprefs) was encountering a "divide by zero" error and deleting the path partially or entirely.

So, I had to write a second script, to run before the first, that would change all the 0's in each stroke-dasharray to something small. I also filed a bug report (which led to a fix that was merged by a Jonathan Neuhauser; thanks!). Testing showed me that a value too close to zero would behave the same as if zero were still present. I had to use a value relative to nearby non-zero values in the stroke-dasharray, and my code is super brittle (it probably wouldn't work for all dashed paths encoding all possible morse code sequences, but my problem was just with simple dotted lines disappearing). … — …

I debugged the scripts so they'd do what I wanted (fix the dashed paths into paths with static subpaths for each dash/dot). I definitely couldn't do what I did with earlier versions of Inkscape; I downloaded and ran the latest Inkscape version (inkscape 1.2); its command line interface is super slick (although it would be nice if there were a sleep command to let me get a peek at changes).

Long story short, I think I've resolved the dashed line issue, Professor DeVoe. ^^;;

Next steps

The core reason I transcribed the textbook in the first place was so that when I completed the problem sets for my own review and education, I'd be able to have a digital document that I could link to and share with others. That means the next step, now that transcription is done, is to create my own solutions to the problem sets and record them in TeXmacs.

This means I will probably need to use DWSIM more in order to cross-check my thermodynamic calculations and create illustrations / diagrams relevant to each solution. Again, I want to be able to hand someone a ZIP file containing everything they need to learn Chemical Engineering; that means not only textbooks and problem sets, but also simulation software that they might use when applying this knowledge in industry. A ZIP file that I could in good conscience send to the people on the ISS should everyone on the planet's surface be doomed.

Posted 2022-03-17T03:48:10+0000

Notable Public Keys book update (Electrum, Tails, Veracrypt)

Created by Steven Baltakatei Sandoval on 2022-03-12T12:52Z under a CC BY-SA 4.0 license and last updated on 2022-03-12T13:18Z.

I added three sections to the Notable Public Keys book (PDF, GitLab):

New Sections


Tails

I added a few of the Tails keys and did some research about the origin of Tails. I didn't know that its name (i.e. "The (Amnesic) Incognito Live System") hints at the fact that there used to be two projects that merged. One was called "Incognito" (a privacy-focused operating system based on Gentoo) and the other was "Amnesia" (also a privacy-focused operating system, based on Debian). See the PDF for links and notes.

There are many more Tails keys covering various subgroups of Tails development operations (PR, financial, etc.). I've elected to link only to the page containing these fingerprints and to focus on the fingerprints necessary for verifying downloadable installation .iso images (funnily enough, their mailing list key was also an image signing key very early on, so I include it).


This is software I had learned about from Steve Gibson's Security Now podcast. I've used it occasionally over the years; it was one of my first entries in the OpenPGP fingerprint notes that later became the Notable Public Keys book. I guess the reason I hadn't added it to the book until today was that I was still using my unpublished notes as a source of fingerprints. It's time to share what I know!


Years ago, I uploaded a Youtube video about how to verify the Electrum installer for Windows 10. It's probably now out-of-date but the verification procedure still relies on using Thomas Voegtlin's fingerprint. Lately the download page has linked to two other public keys; one seems to be an ElectrumX developer.

The cost of failure for not checking signatures of this software grows each year as Bitcoin does what it does best: persist. It's my hope that this chapter of the book can help prevent people from losing money, if only by providing a persistent reference through time about which fingerprint to trust.


I made various updates to the <index> tags so that the Index section can be better organized when TeXmacs generates it. This work actually took about as long as researching the histories of Tails and Veracrypt combined, but it involved many more keypresses because I needed to check each existing chapter's tags.

Future Sections

The next chapters I plan to write are below in descending order of priority.

  • GnuPG. Software often used to perform OpenPGP operations.
  • F-Droid. An alternative to the Google Play store.
  • LibreOffice. An alternative to Microsoft Office.
  • Red Hat. A popular commercial GNU/Linux operating system / service company I don't have much experience with but which uses OpenPGP signatures to sign software.
Posted 2022-03-12T13:24:30+0000

Notable Public Keys book update (Tor Browser, Youtube-dl)

Created by Steven Baltakatei Sandoval on 2022-03-09T18:13Z under a CC BY-SA 4.0 license and last updated on 2022-03-10T01:45Z.

I added two sections to the Notable Public Keys book (PDF, GitLab):

New Sections

Tor Browser

Tor Browser, along with Tails, is definitely one of the earliest programs I used that prominently featured verification methods for its installation executables. It also was one of the more prominent victims of a certificate spamming attack years ago. These details I made sure to include in the chapter I wrote. Although several PGP keys are mentioned in various docs, only a single key seems to continuously be used to sign release executables.


Although Youtube-dl, a Python2 project, seems to have become idle compared to its Python3 fork (see yt-dlp), it uses OpenPGP keys to sign releases, and its GitHub project still sees occasional updates. The fork, yt-dlp, doesn't seem to use OpenPGP signatures on release files; however, some of its developers who worked on Youtube-dl do sign commits with OpenPGP, so I mentioned their public key fingerprints so that my new script can save a copy of their public keys in the book repository's ref/pgp_keys/ directory.


Bitcoin Core

I noticed this week that Bitcoin Core changed the way it signs binary releases. Specifically, around 2021-09 last year, its Download page began linking to a signature file (SHA256SUMS.asc) separately from its release hash file (SHA256SUMS). Before, both the hashes and the signature were contained within the same file (SHA256SUMS.asc). This change was made upon release of Bitcoin Core v0.22.0 so that multiple people (besides Wladimir J. van der Laan) could sign the binary release files. I count 12 signatures in the SHA256SUMS.asc file for the v0.22.0 release, none of which was made with van der Laan's project signing key (90C8 019E 36C2 E964), which had been used to sign v0.11.0 through v0.21.2. Instead, van der Laan's signature was generated from his personal key.

This new method of signing releases makes sense to me if multiple groups wish for their own representative to personally review the code and sign off on it; instead of a group needing to figure out if they can trust van der Laan, they can more simply trust their group's representative.
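For readers curious about the mechanics, a sketch of the two-step check under this scheme follows. The gpg step requires the signers' public keys in your keyring, so the runnable portion below demonstrates only the hash-check half against a throwaway file; the filename is a placeholder, not a real release artifact:

```shell
# Step 1 (not run here; needs the signers' public keys imported):
#   gpg --verify SHA256SUMS.asc SHA256SUMS
# Step 2: confirm a downloaded binary matches its listed hash.
# A throwaway file stands in for a real release download:
tmp=$(mktemp -d) && cd "$tmp"
printf 'pretend-binary\n' > bitcoin-x.y.z-x86_64-linux-gnu.tar.gz
sha256sum bitcoin-x.y.z-x86_64-linux-gnu.tar.gz > SHA256SUMS
# --ignore-missing skips the other platforms' files listed in a real
# SHA256SUMS; here it checks the one file present and reports "OK".
sha256sum --ignore-missing --check SHA256SUMS
```

The check exits non-zero if the hash does not match, which is the property the multi-signer scheme ultimately protects.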


I wrote a bash script to automatically scan the book's source code for strings resembling gpg fingerprints and then check whether any public keys in my keyring match. The script then exports a minimal (no third-party signatures) ASCII-armored version of each matching public key to a target directory, using the full 40-character fingerprint in the file name. I have been meaning to include minimal copies of all public keys I mention in the book in the repository; this script lets me do that. Since I intend the book to be printable on paper, I do not plan on including the actual public keys themselves in the book: some of them can be quite large even in minimal format, and I don't want to have to write code to strip out extraneous UIDs.

After I wrote and tested it, I realized it wouldn't be too much extra work to adapt a fork of the script to scan webpages for things resembling gpg fingerprints. Several of the projects for which I have written chapters post the fingerprints used to sign files on centralized download pages that are kept updated; whenever their PGP key changes, they update the fingerprint. This fork can be a handy tool for quickly identifying public keys that may be missing from my collection when I occasionally sit down to update chapters of the book.
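A minimal sketch of the scan-and-export idea is below; the regex, directory paths, and normalization steps are my illustrative guesses, not the actual script:

```shell
#!/usr/bin/env bash
# Sketch: scan source files for 40-hex-char strings resembling PGP
# fingerprints, then export a minimal armored copy of any matching
# public key from the local keyring. Paths and regex are illustrative.
src_dir="./src"          # hypothetical book source directory
out_dir="./ref/pgp_keys" # target directory mentioned in the post
mkdir -p "$out_dir"

# Collect candidates (40 hex chars, optionally spaced in groups of 4),
# normalize by stripping spaces and uppercasing, then de-duplicate.
grep -rhoE '([0-9A-Fa-f]{4}[[:space:]]?){10}' "$src_dir" 2>/dev/null \
  | tr -d '[:space:]' | tr 'a-f' 'A-F' | sort -u \
  | while read -r fpr; do
      # Only export keys actually present in the local keyring.
      if gpg --list-keys "$fpr" >/dev/null 2>&1; then
        gpg --export-options export-minimal --armor \
            --export "$fpr" > "$out_dir/$fpr.asc"
      fi
    done
```

Exporting with export-minimal keeps each .asc file small by omitting third-party certification signatures, matching the goal of storing compact reference copies in the repository.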

Future Sections

The next chapters I plan to write are below in descending order of priority.

  • Electrum. A bitcoin wallet. I believe including this in the book will help prevent some people losing serious amounts of money over time.
  • TailsOS. A privacy-focused GNU/Linux operating system that uses tor for all communication. I believe including this in the book will help give journalists and activists more confidence when tackling the intimidating process of installing Tails for the first time.
  • VeraCrypt. A filesystem encryption program. An audited successor to TrueCrypt.
  • F-Droid. An alternative to the Google Play store.
  • LibreOffice. An alternative to Microsoft Office.
  • Red Hat. A popular commercial GNU/Linux operating system / service company I don't have much experience with but which uses OpenPGP signatures to sign software.
Posted 2022-03-09T18:52:00+0000

Notable Public Keys book update (Debian, Satoshi Labs)

Created by Steven Baltakatei Sandoval on 2022-01-03T12:58Z under a CC BY-SA 4.0 license and last updated on 2022-01-03T13:26Z.

I added two sections to my Notable Public Keys book (PDF, GitLab):


For the Debian chapter I focused on PGP keys used by the "Debian CD" group to sign .iso images used to install Debian onto new systems.

I included this chapter because I use Debian as my main workstation OS. I also know that it is used as the basis for several other popular GNU/Linux distributions such as Ubuntu and some that I've been looking into using such as PopOS.

Because the Debian organization continues to use GnuPG keys as the basis for officially authenticating developer contributions, their use of PGP keys goes back longer than most organizations covered by this book (the oldest Debian CD key I found is dated 1999-01-30).

Satoshi Labs

For the Satoshi Labs chapter I included two recent keys used to sign Trezor software such as Trezor Suite and Trezor Bridge. One of the Satoshi Labs founders, known as Stick, has also created signatures distributed alongside software such as Trezor Bridge. Currently the latest recommended method for using a Trezor hardware wallet is Trezor Suite.

I included this chapter because I use a Trezor to store bitcoin. Because money is at stake I have maintained notes about PGP keys used to sign Trezor software. In fact, one of the reasons why I decided to make the book was to gather all my notes into a single coherent text.

Future sections

In the project README I have the following entities whose public keys I am still planning to include in their own sections:

Posted 2022-01-03T13:27:01+0000

TeXmacs typesetting practice: Thermodynamics textbook

Created by Steven Baltakatei Sandoval on 2022-01-02T19:32Z under a CC BY-SA 4.0 license and last updated on 2022-01-03T09:23Z.


In 2020, I decided to retypeset an open/libre textbook called "Thermodynamics and Chemistry" authored by Dr. Howard DeVoe, Associate Professor Emeritus of University of Maryland. A PDF draft containing everything but the bibliography and problem solutions is available here.


For some time I have wanted to explore all the features the math typesetting program TeXmacs has to offer. Also, I have wanted to share what I've learned in Chemical Engineering with others. Recent improvements in remote collaboration technologies have convinced me that it is worth investing significant amounts of time sharing what I know and helping others share what they know.

My experience with early digital texts of the 2000s

Story time.

I grew up at the end of an era when textbooks did not require maintenance of an internet connection. While attending university around 2009, I recall that physics textbooks offered extra features such as access codes to private web servers that provided dynamic content such as simple physics simulators. These were amusing curiosities that I don't think many students took advantage of. The graduate students acting as teaching assistants in my freshman physics course relied more on physical demonstrations and verbal discussions rather than attempting to insert a computer screen into the mix. The access codes printed on cardstock inserts in textbooks we purchased from the university bookstore were never used.

My experience with industry-specific digital texts

I graduated university but my desire to acquire knowledge remained. While working to maintain a supercritical carbon dioxide injection plant in remote Southern Utah, I gained a more intuitive understanding of the laws of thermodynamics. I spent many hours analyzing pressure, temperature, composition, and flowrates of real fluids. The product of the entire process was hydrocarbon liquid that people indirectly paid me to extract so they could burn it in their automobiles. It was here that an engineer named Gary Brom recommended I try the chemical process simulation software VMGSim (which I understand has since been acquired by Schlumberger). The company I worked for paid for the license (several thousand USD per year) since I was able to use the software to make predictions that justified certain equipment setpoints. There were temperature, pressure, and flowrate setpoints that were critical in minimizing problems like hydrate formation, sulfide stress cracking from H₂S, and lost compressor capacity due to improperly stabilized condensate. VMGSim helped me explore the possibility space of different equipment configurations without needing to risk real equipment. The software, however, did not include user education in thermodynamics within its scope. One could learn much from playing with different settings and seeing how simulations responded, but in order to make the most of the software's features I had to educate myself. The money I earned from solving straightforward problems (e.g. corrosive RO permeate) allowed me to purchase industry-consensus "standards" containing knowledge sold by engineers before me to organizations such as:

I understand that each of these organizations has a committee that decides what facts and recommendations to include in each journal issue and each standard document they publish. I worked with an engineer named Philip Dickinson who worked with one of these committees in his spare time. From what I can tell, participation in these committees is unpaid. Historically, someone who wanted a document from the libraries of these organizations could receive a physical copy via a mail-order catalogue.

Reappearance of subscription libraries

However, one trend I noticed after graduating from university in 2011 was that access to knowledge from such libraries was becoming more restricted, not more available, over time. I know API, NACE, ASHRAE, ASME, and ANSI only sell digital copies of their standards as PDFs protected by Digital Rights Management mechanisms such as encryption or by printing the first purchaser's personal information as a watermark on every page. Additionally, even if you purchase a physical copy from ASME or API, the copy is simply a printout of the DRM-protected PDF. This reminds the reader on every page that unless they personally purchased their particular instance, they are criminals. I imagine this also means that a university engineering library that once was able to provide students and the public with access to physical volumes is now unable to do so without violating the DRM license. I may have seen indirect evidence of this when I visited the engineering library of Columbia University in the City of New York in 2018-05; I found that the library had been converted into a Wi-Fi hotspot with some token bookshelves of miscellaneous texts; in order to access content you had to be enrolled as a student and have an internet connection. I had visited the library and paid for a visitor's pass thinking that while I was visiting Manhattan I would pass some afternoons sampling various fields of knowledge. I was disappointed and ended up enjoying the New Jersey Institute of Technology's engineering library (still physical as of 2018-05) instead. Similarly, my alma mater's engineering library suffered a fate similar to Columbia's in 2010.

Subscription libraries are not new. Objectively speaking, the policy of engineering libraries such as those of Stanford University to permit public access to texts was a historical aberration. A private entity should be able to withhold physical access to its possessions. My dissatisfaction with Stanford's move to restrict access to physical texts (and restricting access to non-physical texts via DRM) was because these physical restrictions didn't exist when I was an undergraduate student there; Stanford Libraries seemed functionally indistinguishable from what I would call a public engineering library.

The old books-resting-on-a-shelf model at Stanford could afford to be egalitarian because the costs of maintaining public physical access (books being lost, copyright violation requiring someone to physically photograph or xerox pages) were sufficiently low. I am not aware of students wholesale xeroxing textbooks ever being a significant issue. However, once books began to be digitized, the initial digital infrastructure Stanford adopted for storing, indexing, and serving these new books was proprietary platforms such as JSTOR. A person browsing a wide selection of journal articles used to be able to walk down the library stacks and pick random volumes off the shelf to read. Now, with JSTOR, this person would basically have to pay to own a private copy of each article they wanted to view. This flies in the face of the fact that upon digitization, distribution costs for books drop to near zero. Digital books, be they photographs or digitally typeset, can be copied and transmitted without needing to create and transport glue, cellulose, and ink composites across the planet. In theory, a digital engineering library should cost less to browse and explore than a physical one.

Yet, as of 2022, I know of no engineering library that meets such criteria. However, I have identified some close digital analogues that may meet my requirements.

Free digital libraries

Digital libraries that actively work to keep their texts open include:

What these organizations have in common is use of "Copyleft" licenses, specifically Creative Commons Attribution-ShareAlike 4.0 (CC BY-SA 4.0). The idea is that an author should be able to make sure anyone who shares or resells their work cannot become a middleman that attaches new legal strings to buyers of the work. A common example of such additional legal strings is a prohibition by a publisher against copying a work or circumventing copying restrictions (see DMCA). Such strings are useful tools to improve revenue to the publisher by requiring that the publisher be a middleman in every transaction involving the work. However, this requirement can be abused for purposes besides simply improving sales of a work. For example, if a middleman can make a work unavailable unless a continuous subscription fee is paid (i.e. a subscription library), then the middleman has the ability to punish users based on matters unrelated to the purchase and therefore exert political control over the users.

To illustrate, let me give an example involving Lord of the Rings.

Lord of the Rings analogy of Copyleft

Imagine if Sauron had a benevolent brother named Miło who forged a magic ring that not only gave its user supernatural abilities but also the ability to create perfect copies of itself for others to use. Let's call this Miło's Ring 1.0. Intellectual property law would be analogous to how copies of Sauron's One Ring could not be made unless they were lesser rings whose continued use required continuous acceptance of Sauron's terms and conditions that he could change at any time. Sauron used his control over users of the lesser rings to convert some of them into his servants, such as the Nazgûl ("Ringwraiths") who lacked free will.

Without some Copyleft mechanism, any one of Miło's rings could be converted into a One Ring with its own set of manipulative lesser rings bound to it; this would be analogous to releasing Miło's ring into the public domain. Unlike public domain, however, Copyleft prohibits attaching mechanisms of control to manipulate users of further copies. This prohibition is made explicit in section 2.a.5.C of CC BY-SA 4.0:

No downstream restrictions. You may not offer or impose any additional or different terms or conditions on, or apply any Effective Technological Measures to, the Licensed Material if doing so restricts exercise of the Licensed Rights by any recipient of the Licensed Material.

Also, the "ShareAlike" section (3.b) offers an interesting benefit. Let's say a wizard improved Miło's Ring 1.0 to make a 2.0 version that enabled the wearer to fly in addition to being invisible. The ShareAlike section would require that the improvement not be subject to the wizard's will. For example, a wizard couldn't assassinate someone by waiting until they used Miło's Ring 2.0 to fly high above the ground and then turning off the flying ability so they came crashing down to earth.

Some would argue that publishing under copyleft licenses is inferior to publishing into the public domain. Couldn't Miło release many copies of his ring in order to outcompete Sauron's lesser rings? This might be a viable strategy were Sauron not also capable of changing the terms and conditions of his lesser rings to permit, for a time, free duplication. Sauron could then make his lesser rings indistinguishable from the public-domain Miło's Ring 1.0 or market them as walled-garden improvements. Then, when Miło's Ring 1.0 has disappeared from Middle Earth's marketshare and the last users have been hunted down, Sauron could gain power by changing the terms and conditions of his lesser rings to control everyone, disabling free duplication and requiring obedience to his will in order to unlock features that had previously been available. This outcome could have been avoided if a copyleft license had been used.

Digital publishing reduces barrier to entry

The hypothetical library I desire should have books that do not subject readers to mechanisms of control. Mechanisms of control have historically been used to enforce payment to those involved in the book production or reselling process. Therefore, my hypothetical library, unlike Stanford's subscription library, cannot rely on DRM to pay for electricity, bandwidth, server hardware, and IT staff that are all required to make copyleft texts available. However, thanks to digital technologies of today in 2021, only a small fraction of an individual person's lifespan is required to produce a book; typesetting can be performed on a computer and plant fibers need not be pulped and bleached to produce paper; research collaborators can Email or video chat instead of physically transporting their bodies via passenger ships or airliners to conferences.

These digital technologies lower the barrier to entry for more people to share their knowledge. The success of Wikipedia in 2007 is proof of this. As of 2021, 5,000 volunteers provide more than 100 article edits per month and 64,000 provide more than 5 per month (permalink). This is evidence that many people are willing to collaborate to publish what they know on a volunteer basis.

Creative Commons as collaboration tool

I read Atlas Shrugged, a book by Ayn Rand, several times years ago in order to better understand people I knew who identified themselves as libertarians.

The story is a verbose caricature of ignoble socialists ("Moochers") who enslaved noble capitalists ("Producers") through blackmail. After several re-readings I realized Rand had given the Moochers superior cooperation and collaboration abilities that enabled them to orchestrate absurdly effective laws coercing Producers into slavery. The Producers, on the other hand, ended up not collaborating with each other to legally defend themselves but instead encouraged one another to go on strike in order to starve the Moochers.

I bring the story up because it is a story that defends property rights of the individual, which the Creative Commons BY-SA license arguably diminishes. In the John Galt chapter, a causal (not "casual"!) chain of inventor to investor to manufacturer to operator is described through which goods are produced. Between each link a higher entity dictates terms and conditions that restrict the actions of the lower entity. For example, an inventor may sell a computer program critical to the operation of a power plant to an investor under the condition that the program not be shared with anyone else without the inventor's permission; if the investor violates this contract then they may be required, ultimately under threat of government-sanctioned violent force, to pay a fine. The contract could be encoded in a license that the inventor writes and transmits to the investor. To enforce their conditions, the inventor may choose to utilize a DRM scheme that remotely shuts off the program if they aren't paid a monthly subscription fee.

As I described with Miło's ring, DRM, a method of control, can be made into a method of coercion. To illustrate in the context of Atlas Shrugged, imagine a modern rewrite in which Producers extinguish the lights of New York City not by a decade-long campaign to convince engineers to retire but instead by neglecting the cloud authentication servers necessary to operate critical infrastructure. If a power plant requires programs running in a third-party datacenter (e.g. Amazon or Microsoft), then that power plant operates at that third party's pleasure. Similarly, the inventor in my example could shut down the power plant by activating the program's DRM killswitch. DRM preserves the property rights of upstream entities (e.g. inventors, programmers, writers) at the cost of restricting freedoms for downstream entities.

However, if the inventor used a Creative Commons BY-SA license, then such methods of control would not be necessary. Instead of using a copyright license to secure direct compensation, a Creative Commons license could be used as a collaboration tool. Like-minded Producers could freely contribute improvements to the program. Each Producer's experience with the operation of the improved program allows them to demand more compensation when selling their time implementing the program. For example, Red Hat, Inc. is a company that sells commercial support for free software such as GNU/Linux. Red Hat itself does not sell software so much as it sells support for software. Profits are used to purchase closed software in order to release it under free licenses, such as the GNU General Public License (GPL). Thus, newly freed software can be improved by anyone. Producers of software improvements can then charge support fees as consultants not necessarily tied to Red Hat. Although the Creative Commons BY-SA license is not recommended for software the way the GPL is, both promote collaboration by giving recipients the rights to copy, modify, and redistribute works.

Free and Open Source engineering textbooks

When I imagine my ideal public digital engineering library, the texts it contains are licensed Creative Commons BY-SA. For reasons stated above, the quality of the texts could be incrementally improved by anyone. If I were to contribute or improve a text in such a library, I would not want my improvements to be used by a corporate Sauron to lure users into being controlled against their will.

As indirectly mentioned earlier, a nascent form of this library exists on Stack Exchange in the form of the Engineering section. Similar to Wikipedia, Stack Exchange requires that users submit their articles and comments under a Creative Commons BY-SA license. As of 2021, their website supports MathJax rendering of equations, permitting clear communication of math equations encoded with LaTeX markup. I have made some contributions (e.g. "How to calculate kinetic energy of gas flow in pipe given tempearture, pressure, pipe diameter and velocity") to this forum but the question-and-answer format promotes narrow-scope answers to specific questions. This format helps readers quickly find specific answers if they know how to form a query for a search engine. In contrast, an educational textbook is typically a more comprehensive work that provides a sufficiently broad background of a topic so the reader can answer a wide array of questions they may not have been able to articulate for themselves.

My Contribution


I have a desire to share my knowledge because I believe my life can be personally improved if more people such as myself share their knowledge. I have given some thought to how I can make this contribution. As of 2022, given my background in chemical engineering and knowledge about free licenses, I've decided to search for and improve a free and open source thermodynamics textbook.

The Book

I identified a textbook called "Thermodynamics and Chemistry, 2nd Edition" by Howard DeVoe, Associate Professor Emeritus of the University of Maryland. The textbook is published under the Creative Commons Attribution 4.0 International License (CC BY). This means I can create and even sell a derivative work so long as I attribute DeVoe as the original work's author.

So, I created a derivative work here on GitLab (PDF).

Transcribing the Book with TeXmacs

Dr. DeVoe courteously provided me with a copy of the LaTeX source code he used to compile the PDF files published on his website. Throughout 2021, I transcribed this source code into the markup used by GNU TeXmacs, a WYSIWYG typesetting program developed primarily by a researcher named Joris van der Hoeven.

I chose TeXmacs over a more generic LaTeX editor because:

  • It is part of the GNU project, meaning it is free/libre open source.
  • It uses a tree-like data structure for its source code.
  • The source code is not meant to be modified by a generic ASCII editor. Edits are meant to immediately provide visual feedback of typesetting details.
  • It has Emacs shortcuts available for fast keyboard navigation.

I had heard from a friend in graduate school that there wasn't a clear choice of software for creating LaTeX documents. As I understand the situation in academia, one writes LaTeX source in a text editor then renders it to a PDF to see the result. Additionally, the core typesetting software can be extended with plug-ins, for which I could not find a package manager. It seems every graduate school lab has its own particular traditions for submitting typesetting source code to publishers. My friend recommended Overleaf, a browser-based WYSIWYG editor, but I have yet to be convinced that it can handle the textbook features that I have found TeXmacs capable of handling (e.g. automatic table-of-contents generation, automatic footnote and equation numbering).

On 2021-06-29, I published a blog post about how I reached the milestone of transcribing all fourteen chapters.

Current status

As of 2022-01-02, I am still fleshing out the Bibliography, Biography, and Appendices. However, the bulk of the main text, Table of Contents, and Index are functioning. A PDF draft is available here.

If you are interested in helping contribute changes, you can contact me via the GitLab project page, Twitter or snail mail.

Future plans

In 2022, I plan on transcribing the Solutions Manual. I skipped these sections to save time.

I also plan on integrating examples and solutions that a reader can verify using free/libre open source chemical process simulation software such as DWSIM. For instance, without such software, a problem about a steam electric generator would likely require the engineer to solve energy and mass balances by looking up values from large tables called steam tables. As I mentioned earlier, such software helped me daily to tune setpoints for heat exchangers, gas compressors, and other process units of chemical engineering. The source code for such process models would be as important to the textbook as captioned images or tables.
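As a trivial illustration of the kind of balance a steam table feeds, consider the shaft work of an adiabatic turbine, W = m*(h_in - h_out); the flowrate and enthalpy figures below are illustrative placeholders, not values from an actual steam table:

```shell
# Energy balance across an adiabatic steam turbine: W = m*(h_in - h_out).
# In practice h_in and h_out come from steam-table lookups at the inlet
# and outlet conditions; the values here are placeholders.
awk 'BEGIN {
  m     = 10.0    # mass flowrate, kg/s (placeholder)
  h_in  = 3230.0  # inlet specific enthalpy, kJ/kg (placeholder)
  h_out = 2675.0  # outlet specific enthalpy, kJ/kg (placeholder)
  printf "W = %.1f kW\n", m * (h_in - h_out)
}'
# prints: W = 5550.0 kW
```

A simulator such as DWSIM performs the property lookups and balance closure automatically, which is why a reader could use it to verify a textbook solution.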


In 2021 I transcribed a thermodynamics textbook to satisfy a desire I had to share some of what I learned in the field of chemical engineering. The textbook source code is published on GitLab under a Creative Commons BY-SA license so that others can improve it. A compiled PDF draft is available here.

Posted 2022-01-03T08:47:50+0000

Notable Public Keys book

Created by Steven Baltakatei Sandoval on 2021-07-19T22:52Z under a CC BY-SA 4.0 license and last updated on 2021-07-20T02:09Z.


I decided to write a book with some notes I have been keeping regarding public keys I have spent some time verifying over the years for my own purposes.

Current use of public key cryptography

Keyservers permit automated distribution of public keys. Some key owners attest to the identity of other key owners by means of digital signatures attached to each others' public keys. However, some esoteric technical expertise is required to examine this machine-readable signature data.

In response, services such as Keybase have appeared that offer to take care of these details, letting people manage their online identities through a web user interface. By default, a user's PGP public key fingerprint is displayed prominently on their profile page. With these keys, Keybase offers people services such as end-to-end encrypted messaging that do not require Keybase to be able to see the contents of the messages.

Other services such as Signal go a step further and don't require the user to know what a public key is. End-to-end encryption is implemented using public key cryptography, but user identity details are established through SMS confirmations. Fingerprint comparison for each encrypted conversation is instead an optional feature called a "Safety Number", which a user may or may not choose to verify.

In the interest of achieving cryptography "at scale", these systems store information about which keys are to be trusted in machine-readable format. Signal users can form their own networks of trust by instructing their phones to trust other users' phones through Safety Number information. Keybase users can do likewise with Keybase servers or Keybase software running on their devices. In each of these situations, a machine manages trust information according to human decisions.

A case not covered by existing software

There is a subtle assumption that the user has a close enough relationship with another user to be able to rationally decide to codify that relationship. This is no hurdle for Tom, who uses Signal with his siblings, since he is likely familiar with their idiosyncratic behaviors. However, what if Tom received from his sibling Jess a news story containing serious allegations of criminal acts by Tom's favorite politician, the one he votes for? The news story was written by a reporter Tom has never heard of. Since Tom's voting behavior is at stake, he will want to examine the provenance of these allegations, perhaps starting with the reputation of the reporter. While Tom may be able to trust that the news story did come from Jess thanks to Signal, even if the reporter used Signal, they would likely be unable to respond to individual questions from the many interested people such as Tom.

There is an imbalance between the need of the many to verify and the need of the popular to prove. Typically, this imbalance is partially satisfied by the use of Transport Layer Security (TLS) and public keys known as "certificates" vouched for by authorities whose keys are stored in popular web browsers; this is the padlock icon seen near the browser's address bar. As of 2021, web sites lacking valid certificates trigger some form of warning to the user (e.g. a warning page or a red cross icon near the browser's address bar). TLS used by the news story's parent site would give Tom a handhold in his task to verify the reporter's reputation. He could at least trust the integrity of other works by the reporter, allowing him to build up a story about who this reporter really is and what they really know.
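The fields a browser examines before showing that padlock can be inspected directly. The sketch below generates a throwaway self-signed certificate (the `example.test` name and `/tmp` paths are placeholders for illustration) and prints its subject, issuer, validity period, and fingerprint:

```shell
# Illustration only: create a short-lived self-signed certificate.
# A real browser-trusted certificate would be issued by a CA instead.
openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/key.pem \
  -out /tmp/cert.pem -days 1 -subj "/CN=example.test" 2>/dev/null

# Print the fields a browser checks: who the certificate names,
# who signed it, when it expires, and its unique fingerprint.
openssl x509 -in /tmp/cert.pem -noout -subject -issuer -dates -fingerprint -sha256
```

For a live site, `openssl s_client -connect example.com:443` shows the same fields for the certificate the server actually presents.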

It is this need to construct an internally consistent and plausible "story" that I identified as a need that apps such as Keybase and Signal do not readily fulfill. The situation Tom finds himself in is similar to situations I often find myself in. I wish to fulfill my civic duty by understanding political candidates so I can vote for the best one. I wish to protect my devices against malware by understanding the provenance of the different software packages available and installing the best one. In both of these situations I find myself assembling notes about the identities of notable individuals. And since I live in an era where public key cryptography exists, I find myself building stories around public keys.

Stories about public keys

Over time I have accumulated notes and memories identifying the public keys used by entities whose works I use. Last week I decided to assemble the content of these notes into the form of a TeXmacs book. Yesterday I composed three sections of it. One section was about keys used to sign executable binaries of the reference implementation of the Bitcoin protocol, Bitcoin Core. Another section was about the key GitHub uses to automatically sign commits made using its web interface. The third section was about a key used to sign SD card images of RaspiBlitz, a software package I am testing for running a Lightning node for Bitcoin microtransactions. The book relies heavily on the Internet Archive's Wayback Machine for hosting long-term links to webpages that serve as evidence for my claims.
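The verification workflow those sections document is the standard GnuPG detached-signature check. A minimal self-contained sketch, using a throwaway key in a temporary keyring (the signer name and file names are illustrative, not any project's real release artifacts):

```shell
# Work in a temporary keyring so the real one is untouched.
export GNUPGHOME="$(mktemp -d)"

# Generate a throwaway signing key (illustrative identity).
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key "Release Signer <signer@example.org>" 2>/dev/null

# Sign a stand-in "release file" with a detached ASCII-armored signature.
echo "pretend this is a release tarball" > release.tar
gpg --batch --pinentry-mode loopback --passphrase '' \
    --armor --detach-sign --output release.tar.asc release.tar

# Verify: reports "Good signature from ..." if file and signature match.
gpg --verify release.tar.asc release.tar
```

In practice the hard part is not running `gpg --verify` but deciding whether the key that produced the "Good signature" line really belongs to the project, which is precisely the question the book's prose is meant to help answer.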

Other subjects I have in mind to write future sections about include:

  • Tails - A privacy-focused operating system based on Debian.
  • Debian - A GNU/Linux distribution from which several notable distributions are derived (e.g. Tails, Ubuntu).
  • New York Times - A notable newspaper that accepts confidential tips via PGP-encrypted email, Signal, and WhatsApp.
  • VeraCrypt - Encryption software. Successor to TrueCrypt.
  • Tor Browser - Private web browser.
  • F-Droid - An open-source android software repository.

If browser certificates and esoteric gpg commands are on one end of a spectrum, this book is at the opposite end. Keyserver data is machine-readable; this book is meant to be human-readable. A signature packet on a PGP key is not designed to note political rivalries between cryptocurrency forks. Conflict makes human stories more interesting. Plenty of news articles and social media posts exist that mention individuals who own public keys, but I am not aware of a publication that focuses on identifying public keys through prose.


I'm writing a book containing my notes to help people identify public keys of notable entities. I'm doing this because I want to offer an alternative, however small, to machine-oriented methods. Also, I have been collecting these notes anyway for myself so I may as well help others while I am at it.

I'm mirroring the source for the book to GitLab.

Posted 2021-07-20T02:09:58+0000

This blog is powered by ikiwiki.

Text is available under the Creative Commons Attribution-ShareAlike license; additional terms may apply.