
Notable Public Keys book update (Debian, Satoshi Labs)

Created by Steven Baltakatei Sandoval on 2022-01-03T12:58Z under a CC BY-SA 4.0 license and last updated on 2022-01-03T13:26Z.

I added two sections to my Notable Public Keys book (PDF, GitLab):

Debian

For the Debian chapter I focused on PGP keys used by the "Debian CD" group to sign .iso images used to install Debian onto new systems.

I included this chapter because I use Debian as my main workstation OS. I also know that it is the basis for several other popular GNU/Linux distributions, such as Ubuntu, as well as some I've been looking into using, such as PopOS.

Because the Debian organization continues to use GnuPG keys as the basis for officially authenticating developer contributions, its use of PGP keys goes back further than that of most organizations covered in this book (the oldest Debian CD key I found is dated 1999-01-30).

Satoshi Labs

For the Satoshi Labs chapter I included two recent keys used to sign Trezor software such as Trezor Suite and Trezor Bridge. One of the Satoshi Labs founders, known as Stick, has also created signatures distributed alongside software such as Trezor Bridge. Currently, the recommended way to use a Trezor hardware wallet is through Trezor Suite.

I included this chapter because I use a Trezor to store bitcoin. Because money is at stake I have maintained notes about PGP keys used to sign Trezor software. In fact, one of the reasons why I decided to make the book was to gather all my notes into a single coherent text.

Future sections

The project README lists the entities whose public keys I am still planning to include in their own sections.

Posted 2022-01-03T13:27:01+0000

TeXmacs typesetting practice: Thermodynamics textbook

Created by Steven Baltakatei Sandoval on 2022-01-02T19:32Z under a CC BY-SA 4.0 license and last updated on 2022-01-03T09:23Z.

Summary

In 2020, I decided to retypeset an open/libre textbook called "Thermodynamics and Chemistry" authored by Dr. Howard DeVoe, Associate Professor Emeritus of the University of Maryland. A PDF draft containing everything but the bibliography and problem solutions is available here.

Background

For some time I have wanted to explore all the features the math typesetting program TeXmacs has to offer. Also, I have wanted to share what I've learned in Chemical Engineering with others. Recent improvements in remote collaboration technologies have convinced me that it is worth investing significant amounts of time sharing what I know and helping others share what they know.

My experience with early digital texts of the 2000s

Story time.

I grew up at the end of an era when textbooks did not require maintenance of an internet connection. While attending university around 2009, I recall that physics textbooks offered extra features such as access codes to private web servers that provided dynamic content such as simple physics simulators. These were amusing curiosities that I don't think many students took advantage of. The graduate students acting as teaching assistants in my freshman physics course relied more on physical demonstrations and verbal discussions than on attempting to insert a computer screen into the mix. The access codes printed on cardstock inserts in textbooks we purchased from the university bookstore were never used.

My experience with industry-specific digital texts

I graduated from university but my desire to acquire knowledge remained. While working to maintain a supercritical carbon dioxide injection plant in remote Southern Utah, I gained a more intuitive understanding of the laws of thermodynamics. I spent many hours analyzing pressures, temperatures, compositions, and flowrates of real fluids. The product of the entire process was hydrocarbon liquid that people indirectly paid me to extract so they could burn it in their automobiles. It was here that an engineer named Gary Brom recommended I try using the chemical process simulation software VMGSim (which I understand has since been acquired by Schlumberger). The company I worked for paid for the license (several thousand USD per year) since I was able to use the software to make predictions that justified certain equipment setpoints. There were temperature, pressure, and flowrate setpoints that were critical in minimizing problems like hydrate formation, sulfide stress cracking from H₂S, and lost compressor capacity due to improperly stabilized condensate. VMGSim helped me explore the possibility space of different equipment configurations without needing to risk real equipment. The software, however, did not include user education in thermodynamics within its scope. One could learn much from playing with different settings and seeing how simulations responded, but in order to make most of the software features useful I had to educate myself. The money I earned from solving straightforward problems (e.g. corrosive RO permeate) allowed me to purchase industry-consensus "standards" containing knowledge sold by engineers before me to standards organizations such as API, NACE, ASHRAE, ASME, and ANSI.

I understand that each of these organizations has a committee that decides what facts and recommendations to include in each journal issue and each standard document it publishes. I worked with an engineer named Philip Dickinson who worked with one of these committees in his spare time. From what I can tell, participation in these committees is unpaid. Historically, someone who wanted a document from the libraries of these organizations could receive a physical copy via a mail-order catalogue.

Reappearance of subscription libraries

However, one trend I noticed after graduating from university in 2011 was that access to knowledge from such libraries was becoming more restricted, not more available, over time. I know API, NACE, ASHRAE, ASME, and ANSI only sell digital copies of their standards as PDFs protected by Digital Rights Management mechanisms such as encryption or by printing the first purchaser's personal information as a watermark on every page. Additionally, even if you purchase a physical copy from ASME or API, the copy is simply a printout of the DRM-protected PDF. This reminds the reader on every page that unless they personally purchased their particular instance, they are criminals. I imagine this also means that a university engineering library that once was able to provide students and the public with access to physical volumes is now unable to do so without violating the DRM license. I may have seen indirect evidence of this when I visited the engineering library of Columbia University in the City of New York in 2018-05; I found that the library had been converted into a Wi-Fi hotspot with some token bookshelves of miscellaneous texts; in order to access content you had to be enrolled as a student and have an internet connection. I had visited the library and paid for a visitor's pass thinking that while I was visiting Manhattan I would pass some afternoons sampling various fields of knowledge. I was disappointed and ended up enjoying the New Jersey Institute of Technology's engineering library (still physical as of 2018-05) instead. Similarly, my alma mater's engineering library suffered a fate similar to Columbia's in 2010.

Subscription libraries are not new. Objectively speaking, the policy of engineering libraries such as those of Stanford University to permit public access to texts was a historical aberration. A private entity should be able to withhold physical access to its possessions. My dissatisfaction with Stanford's move to restrict access to physical texts (and to restrict access to non-physical texts via DRM) arose because these restrictions didn't exist when I was an undergraduate student there; Stanford Libraries seemed functionally indistinguishable from what I would call a public engineering library.

The old books-resting-on-a-shelf model at Stanford could afford to be egalitarian because the costs of maintaining public physical access (books being lost, copyright violation requiring someone to physically photograph or xerox pages) were sufficiently low. I am not aware of students wholesale xeroxing textbooks ever being a significant issue. However, once the prospect of books becoming digitized arose, the initial digital infrastructure Stanford adopted for storing, indexing, and serving these new books was proprietary platforms such as JSTOR. A person browsing a wide selection of journal articles used to be able to walk down the library stacks and pick random volumes off the shelf to read. Now, with JSTOR, this person would basically have to pay to own a private copy of each article they wanted to view. This flies in the face of the fact that upon digitization, distribution costs for books drop to near zero. Digital books, be they photographs or digitally typeset, can be copied and transmitted without needing to create and transport glue, cellulose, and ink composites across the planet. In theory, a digital engineering library should cost less to browse and explore than a physical one.

Yet, as of 2022, I know of no engineering library that meets such criteria. However, I have identified some close digital analogues that may meet my requirements.

Free digital libraries

Digital libraries that actively work to keep their texts open include:

What these organizations share in common is use of "Copyleft" licenses, specifically, Creative Commons Attribution-ShareAlike 4.0 (CC BY-SA 4.0). The idea is that an author should be able to make sure anyone who shares or resells their work cannot become a middleman that attaches new legal strings to buyers of the work. A common example of such additional legal strings is a prohibition by a publisher against copying a work or circumventing copying restrictions (see DMCA). Such strings are useful tools to improve revenue to the publisher by requiring the publisher be a middleman in every transaction involving the work. However, this requirement can be abused for purposes besides simply improving sales of a work. For example, if a middleman can make a work unavailable unless a continuous subscription fee is paid (i.e. a subscription library), then the middleman has the ability to punish users based on matters unrelated to the purchase and therefore exert political control over the users.

To illustrate, let me give an example involving Lord of the Rings.

Lord of the Rings analogy of Copyleft

Imagine if Sauron had a benevolent brother named Miło who forged a magic ring that not only gave its user supernatural abilities but also the ability to create perfect copies of itself for others to use. Let's call this Miło's Ring 1.0. Intellectual property law would be analogous to how copies of Sauron's One Ring could not be made unless they were lesser rings whose continued use required continuous acceptance of Sauron's terms and conditions that he could change at any time. Sauron used his control over users of the lesser rings to convert some of them into his servants, such as the Nazgûl ("Ringwraiths") who lacked free will.

Without some Copyleft mechanism, any one of Miło's rings could be converted into a One Ring with its own set of manipulative lesser rings bound to it; this would be analogous to releasing Miło's ring into the public domain. Unlike public domain, however, Copyleft prohibits attaching mechanisms of control to manipulate users of further copies. This prohibition is made explicit in section 2.a.5.C of CC BY-SA 4.0:

No downstream restrictions. You may not offer or impose any additional or different terms or conditions on, or apply any Effective Technological Measures to, the Licensed Material if doing so restricts exercise of the Licensed Rights by any recipient of the Licensed Material.

Also, the "ShareAlike" section (3.b) offers an interesting benefit. Let's say a wizard improved Miło's Ring 1.0 to make a 2.0 version that enabled the wearer to fly in addition to being invisible. The ShareAlike section would require that the improvement not be subject to the wizard's will. For example, a wizard couldn't assassinate someone by waiting until they used Miło's Ring 2.0 to fly high above the ground and then turning off the flying ability so they came crashing down to earth.

Some would argue that publishing under copyleft licenses is inferior to publishing to the public domain. Couldn't Miło release many copies of his ring in order to outcompete Sauron's lesser rings? This might be a viable strategy were Sauron not also capable of changing the terms and conditions of his lesser rings to permit, for a time, free duplication. Sauron could then make his lesser rings indistinguishable from the public-domain Miło Ring 1.0 or market them as walled-garden improvements. Then, once Miło Ring 1.0 had disappeared from the Middle Earth market and the last users had been hunted down, Sauron could gain power by changing the terms and conditions of his lesser rings to control everyone, disabling free duplication and requiring obedience to his will in order to unlock features that had previously been available. This is an outcome that could have been avoided if a copyleft license had been used.

Digital publishing reduces barrier to entry

The hypothetical library I desire should have books that do not subject readers to mechanisms of control. Mechanisms of control have historically been used to enforce payment to those involved in the book production or reselling process. Therefore, my hypothetical library, unlike Stanford's subscription library, cannot rely on DRM to pay for the electricity, bandwidth, server hardware, and IT staff that are all required to make copyleft texts available. However, thanks to the digital technologies of today in 2021, only a small fraction of an individual person's lifespan is required to produce a book; typesetting can be performed on a computer and plant fibers need not be pulped and bleached to produce paper; research collaborators can email or video chat instead of physically transporting their bodies via passenger ships or airliners to conferences.

These digital technologies lower the barrier to entry for more people to share their knowledge. The success of Wikipedia in 2007 is proof of this. As of 2021, 5,000 volunteers provide more than 100 article edits per month and 64,000 provide more than 5 per month (Wikimedia.org permalink). This is evidence that many people are willing to collaborate to publish what they know on a volunteer basis.

Creative Commons as collaboration tool

I read Atlas Shrugged, a book by Ayn Rand, several times years ago in order to better understand people I knew who identified themselves as libertarians.

The story is a verbose caricature of ignoble socialists ("Moochers") who enslaved noble capitalists ("Producers") through blackmail. After several re-readings I realized Rand had given the Moochers superior cooperation and collaboration abilities that enabled them to orchestrate absurdly effective laws coercing Producers into becoming slaves. The Producers, on the other hand, ended up not collaborating with each other to legally defend themselves but instead encouraged one another to go on strike in order to starve the Moochers.

I bring the story up because it is a story that defends property rights of the individual, rights which the Creative Commons BY-SA license arguably diminishes. In the John Galt chapter, a causal (not "casual"!) chain of inventor to investor to manufacturer to operator is described through which goods are produced. Between each link a higher entity dictates terms and conditions that restrict the actions of the lower entity. For example, an inventor may sell a computer program critical to the operation of a power plant to an investor under the condition that the program not be shared with anyone else without the inventor's permission; if the investor violates this contract then they may be required, ultimately under threat of government-sanctioned violent force, to pay a fine. The contract could be encoded in a license that the inventor writes and transmits to the investor. To enforce their conditions, the inventor may choose to utilize a DRM scheme that remotely shuts off the program if they aren't paid a monthly subscription fee.

As I described with Miło's ring, DRM, a method of control, can be made into a method of coercion. To illustrate in the context of Atlas Shrugged, imagine a modern rewrite in which Producers extinguish the lights of New York City not by a decade-long campaign to convince engineers to retire but instead by neglecting cloud authentication servers necessary to operate critical infrastructure. If a power plant requires programs running in a third party's datacenter (e.g. Amazon or Microsoft) then that power plant operates at that third party's pleasure. Similarly, the inventor in my example could shut down the power plant by activating the program's DRM killswitch. DRM preserves the property rights of upstream entities (e.g. inventors, programmers, writers) at the cost of restricting freedoms for downstream entities.

However, if the inventor used a Creative Commons BY-SA license, then such methods of control would not be necessary. Instead of using a copyright license to secure direct compensation, a Creative Commons license could be used as a collaboration tool. Like-minded Producers could freely contribute improvements to the program. Each Producer's experience with the operation of the improved program allows them to demand more compensation when selling their time implementing the program. For example, Red Hat, Inc. is a company that sells commercial support for free software such as GNU/Linux. Red Hat itself does not sell software so much as it sells support for software. Profits are used to purchase closed software in order to release it under free licenses, such as the GNU General Public License (GPL). Thus, newly freed software can be improved by anyone. Producers of software improvements can then charge support fees as consultants not necessarily tied to Red Hat. Although the Creative Commons BY-SA license is not recommended for software the way the GPL is, both promote collaboration by giving recipients rights to copy, modify, and redistribute works.

Free and Open Source engineering textbooks

When I imagine my ideal public digital engineering library, the texts it contains are licensed Creative Commons BY-SA. For reasons stated above, the quality of the texts could be incrementally improved by anyone. If I were to contribute or improve a text in such a library, I would not want my improvements to be used by a corporate Sauron to lure users into being controlled against their will.

As indirectly mentioned earlier, a nascent form of this library exists on Stack Exchange in the form of the Engineering section. Similar to Wikipedia, Stack Exchange requires that users submit their articles and comments under a Creative Commons BY-SA license. As of 2021, their website supports MathJax rendering of equations, permitting clear communication of math equations encoded with LaTeX markup. I have made some contributions (e.g. "How to calculate kinetic energy of gas flow in pipe given tempearture, pressure, pipe diameter and velocity") to this forum but the question-and-answer format promotes narrow-scope answers to specific questions. This format helps readers quickly find specific answers if they know how to form a query for a search engine. In contrast, an educational textbook is typically a more comprehensive work that provides a sufficiently broad background of a topic so the reader can answer a wide array of questions they may not have been able to articulate for themselves.

My Contribution

Motivation

I have a desire to share my knowledge because I believe my life can be personally improved if more people such as myself share their knowledge. I have given some thought into how I can make this contribution. As of 2022, given my background in chemical engineering and knowledge about free licenses, I've decided to search for and improve a free and open source thermodynamics textbook.

The Book

I identified a textbook called "Thermodynamics and Chemistry, 2nd Edition" by Howard DeVoe, Associate Professor Emeritus of the University of Maryland. The textbook is published under the Creative Commons Attribution 4.0 International License (CC BY). This means I can create and even sell a derivative work so long as I attribute DeVoe as the original work's author.

So, I created a derivative work here on GitLab (PDF).

Transcribing the Book with TeXmacs

Dr. DeVoe courteously provided me with a copy of the LaTeX source code he used to compile the PDF files published on his website. Throughout 2021, I transcribed this source code into the markup used by GNU TeXmacs, a WYSIWYG typesetting program developed primarily by a researcher named Joris van der Hoeven.

I chose TeXmacs over a more generic LaTeX editor because:

  • It is part of the GNU project, meaning it is free/libre open source.
  • It uses a tree-like data structure for its source code.
  • The source code is not meant to be modified by a generic ASCII editor. Edits are meant to immediately provide visual feedback of typesetting details.
  • It has Emacs shortcuts available for fast keyboard navigation.

I had heard from a friend in graduate school that there wasn't a clear choice of software for creating LaTeX documents. As I understand the situation in academia, one writes LaTeX source in a text editor then renders it to a PDF to see the result. Additionally, the core typesetting software can be extended with plug-ins for which I could not find a package manager. It seems every graduate school lab has its own particular traditions for submitting typesetting source code to publishers. My friend recommended Overleaf, a browser-based WYSIWYG editor, but I have yet to be convinced that it can handle the textbook features that I have found TeXmacs to be capable of handling (e.g. automatic table-of-contents generation, automatic footnote and equation numbering).

On 2021-06-29, I published a blog post about how I reached the milestone of transcribing all fourteen chapters.

Current status

As of 2022-01-02, I am still fleshing out the Bibliography, biographical sketches, and Appendices. However, the bulk of the main text, Table of Contents, and Index are functioning. A PDF draft is available here.

If you are interested in helping contribute changes, you can contact me via the GitLab project page, Twitter or snail mail.

Future plans

In 2022, I plan on transcribing the Solutions Manual. I skipped these sections to save time.

I also plan on integrating examples and solutions that a reader can verify using free/libre open source chemical process simulation software such as DWSIM. For instance, without such software, a problem about a steam electric generator would likely require the engineer to solve energy and mass balances by looking up values from large tables called steam tables. As I mentioned earlier, such software helped me daily to tune setpoints for heat exchangers, gas compressors, and other process units of chemical engineering. The source code for such process models would be as important to the textbook as captioned images or tables.
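To give a sense of the hand calculation such software would replace, here is an illustrative steady-state energy balance for an adiabatic steam turbine; the operating conditions and the approximate steam-table enthalpies are my own example values, not taken from DeVoe's text:

\begin{align*}
  \dot{W} &= \dot{m}\,(h_\text{in} - h_\text{out})\\
  h_\text{in}  &\approx 3422~\text{kJ/kg} \quad (6~\text{MPa},\ 500~^{\circ}\text{C})\\
  h_\text{out} &\approx 191.8 + 0.92 \times 2392.8 \approx 2393~\text{kJ/kg} \quad (10~\text{kPa},\ x = 0.92)\\
  \dot{W}/\dot{m} &\approx 1029~\text{kJ/kg}
\end{align*}

A process simulator performs the enthalpy lookups and the quality calculation internally, so a reader could check a result like this by building the unit in DWSIM instead of paging through steam tables.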

Summary

In 2021 I transcribed a thermodynamics textbook to satisfy a desire I had to share some of what I learned in the field of chemical engineering. The textbook source code is published on GitLab under a Creative Commons BY-SA license so that others can improve it. A compiled PDF draft is available here.

Posted 2022-01-03T08:47:50+0000

Notable Public Keys book

Created by Steven Baltakatei Sandoval on 2021-07-19T22:52Z under a CC BY-SA 4.0 license and last updated on 2021-07-20T02:09Z.

Summary

I decided to write a book with some notes I have been keeping regarding public keys I have spent some time verifying over the years for my own purposes.

Current use of public key cryptography

Keyservers such as keys.openpgp.org permit automated distribution of public keys. Some key owners attest to the identity of other key owners by means of digital signatures attached to each others' public keys. However, some esoteric technical expertise is required to examine this machine-readable signature data.
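For instance, with GnuPG one way to look at this signature data directly is the following (the email address is just a placeholder for a key already in your keyring):

$ gpg --list-signatures someone@example.org
$ gpg --export someone@example.org | gpg --list-packets

The first command lists the certifications other key owners have attached to the key; the second dumps the raw OpenPGP packets, which gives an idea of how machine-oriented the underlying format is.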

In response, services such as keybase.io have appeared that offer to take care of these details, letting people manage their online identities through a web user interface. By default, a user's PGP public key fingerprint is displayed prominently on a user's profile page. With these keys, Keybase offers people services such as end-to-end encrypted messaging that do not require Keybase to be able to see the contents of the messages.

Other services such as Signal go a step further and don't even require the user to know what a public key is. End-to-end encryption is implemented using public key cryptography, but with user identity details determined through SMS confirmations. Fingerprint comparison for each encrypted conversation is instead an optional feature called a "Safety Number" which a user may or may not choose to verify.

In the interest of achieving cryptography "at scale", these systems (keys.openpgp.org, keybase.io, signal.org) store information about which keys are to be trusted in machine-readable format. Signal users can form their own networks of trust by instructing their phones to trust other users' phones through Safety Number information. Keybase users can do likewise with Keybase servers or Keybase software running on their devices. In each of these situations, a machine manages trust information according to human decisions.

A case not covered by existing software

There is a subtle assumption that the user has a close enough relationship with another user to be able to rationally decide to codify that relationship. This is no hurdle for Tom, who uses Signal with his siblings, since he is likely familiar with their idiosyncratic behaviors. However, what if Tom received a news story from his sibling Jess about serious allegations of criminal acts by Tom's favorite politician, the one he votes for? The news story was written by a reporter Tom has never heard of. Since Tom's voting behavior is at stake, he will want to examine the provenance of these allegations, perhaps starting with the reputation of the reporter. While Tom may be able to trust that the news story did come from Jess thanks to Signal, even if the reporter used Signal, they would likely be unable to respond to individual questions from the many interested people such as Tom.

There is an imbalance between the need of the many to verify and the need of the popular to prove. Typically, this imbalance is partially satisfied by use of Transport Layer Security (TLS) and public keys known as "certificates" stored in popular web browsers; this is the green padlock icon seen at the top of browsers. As of 2021, web sites lacking valid certificates trigger some form of warning to the user (e.g. a warning page or a red cross icon near the browser's address bar). TLS used by the news story's parent site would give Tom a handhold in his task to verify the reporter's reputation. He could at least trust the integrity of other works by the reporter, allowing him to build up a story about who this reporter really is and what they really know.

It is this need to construct an internally consistent and plausible "story" that I identified as a need that apps such as Keybase and Signal do not readily fulfill. The situation Tom finds himself in is similar to situations I often find myself in. I wish to fulfill my civic duty by understanding political candidates so I can vote for the best one. I wish to protect my devices against malware by understanding the provenance of the different software packages available and installing the best one. In both of these situations I find myself assembling notes about the identities of notable individuals. And since I live in an era where public key cryptography exists, I find myself building stories around public keys.

Stories about public keys

Over time I have accumulated notes and memories identifying the public keys used by entities whose works I use. Last week I decided to assemble the content of these notes into the form of a TeXmacs book. Yesterday I composed three sections of the book I am assembling in TeXmacs. One section was about keys used to sign executable binaries of the reference implementation of the Bitcoin protocol, Bitcoin Core. Another section was about the key GitHub uses to automatically sign commits made using its web interface. The third section was about a key used to sign SD card images of RaspiBlitz, a software package I am testing for running a Lightning node for Bitcoin microtransactions. The book relies heavily on the Internet Archive's Wayback Machine for hosting long-term links to webpages that serve as evidence for my claims.

Other subjects I have in mind to write future sections about include:

  • Tails - A privacy-focused operating system based on Debian.
  • Debian - A GNU/Linux distribution from which several notable distributions are derived (e.g. Tails, Ubuntu).
  • New York Times - A notable newspaper that accepts confidential tips via PGP-encrypted email, Signal, and WhatsApp.
  • Veracrypt - Encryption software. Successor to Truecrypt.
  • Tor Browser - Private web browser.
  • F-Droid - An open-source Android software repository.

If browser certificates and esoteric gpg commands are on one end of a spectrum, this book is at the opposite end. Keyserver data is machine-readable; this book is meant to be human-readable. A signature packet on a PGP key is not designed to note political rivalries between cryptocurrency forks. Conflict makes human stories more interesting. Plenty of news articles and social media posts exist that mention individuals who own public keys but I am not aware of a publication that attempts to focus on identifying public keys with prose.

Conclusion

I'm writing a book containing my notes to help people identify public keys of notable entities. I'm doing this because I want to offer an alternative, however small, to machine-oriented methods. Also, I have been collecting these notes anyway for myself so I may as well help others while I am at it.

I'm mirroring the source for the book to GitLab.

Posted 2021-07-20T02:09:58+0000

Big Lumber PGP key listing

Created by Steven Baltakatei Sandoval on 2021-06-29T19:30Z under a CC BY-SA 4.0 license and last updated on 2021-06-29T23:25Z.

Summary

I updated my PGP key listing on Big Lumber.

pub   rsa4096/0xA0A295ABDC3469C9 2017-10-11 [C] [expires: 2022-07-08]
      Key fingerprint = 3457 A265 922A 1F38 39DB  0264 A0A2 95AB DC34 69C9
uid                   [ultimate] Steven Sandoval <baltakatei@gmail.com>

Background

Key origin

In 2017, past-me created an OpenPGP key using GnuPG (link). It is an rsa4096 key which, at the time, I understood to be the recommended type of key for signing software distributions like Debian.
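For anyone curious, a roughly equivalent certify-only rsa4096 key can be created with a modern GnuPG in one line; this is only a sketch of the idea, not necessarily the procedure past-me followed in 2017:

$ gpg --quick-generate-key "Steven Sandoval <baltakatei@gmail.com>" rsa4096 cert 5y

Subkeys for signing and encryption would then be added separately (e.g. via gpg --edit-key), which is why the listing above shows the primary key with only the [C] certify capability.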

I intended to create a long-term method of asserting my identity to others in a decentralized fashion. Since then, I've learned of other tools (e.g. minisign) whose advertised value is based on their narrower application scopes and their simplicity (see "Complexity is the worst enemy of security"). However, the effort required to maintain my PGP key is relatively minimal. I just have to use it. Anyone interested in verifying digital signatures for files I create or modify can do so with the appropriate software. That's the beauty of software designed with decentralization in mind. You can download the git repository of this blog and use git (currently git version 2.20.1) to evaluate the digital signatures indicating that I composed this message. I'm aware of cryptography people who have opinions and would probably try to sway me to use some other scheme, but what I have works.

Example use-case in fiction

Basically, past-me wanted the ability to establish a secure remote communication channel with someone I might only be able to meet in person for a very short period of time. I loved seeing this scenario portrayed in one of my favorite books, Anathem by Neal Stephenson. In the quote below, the narrator and his colleagues are being collected into a vehicle by military personnel in order to evacuate following a disastrous event. They have only a matter of minutes before they are physically separated for possibly a very long time. One friend, an Ita named Sammann, plays the role of information technology consultant.

A young female Ita came in, followed by a very old male one. They stood around Sammann for a few minutes, reciting numbers to one another. I fancied that we were going to have three Ita in our cell, but then the two visitors walked off the coach and we did not see them again.

The narrator does not explicitly explain what occurred, but I infer that Sammann performed a public key signing party with the help of some voice recording device. All that is necessary is that a person record a key fingerprint (consisting of a large number) and remember who gave them that number.

Current usage

My PGP key uses rsa4096, which produces RSA signatures (based on large prime numbers) that are bulkier than elliptic curve signatures, but they satisfy my need to sign my blog posts. I occasionally use it to encrypt files, but for regular backups produced by bash scripts running on small devices (e.g. my personal time server, my environmental sensors) I run age (pronounced "ah-gay").

My signed blog posts are currently my primary method for maintaining my digital identity under my terms. Additionally, I publish this blog under the reboil.com domain, which I own; I make use of the fact that domain name space is a limited resource, especially for .com top-level domains. If anyone wanted to impersonate me to someone aware that I use reboil.com, they'd have to take over that domain somehow. Also, the Internet Archive hopefully keeps occasional snapshots of my blog post history secured.

Big Lumber key upload

One auxiliary function of PGP keys is to sign the PGP keys of other people who may be interested in establishing their own digital identity. Due to the COVID-19 pandemic I have not wanted to risk attending Linux conferences to try and meet others to do so.

However, if someone wanted to meet me, they could invite me to sign their PGP key. I'd be willing to have lunch or something. For that possibility, I am creating an entry on Big Lumber, an older website used by people interested in meeting other people to sign keys. I live in the area of Vancouver, Washington.

I'm adding a link to this blog post in my new listing. Like previous listings, I am setting an expiry date at the end of this year. I'll make another listing in 2022.

Using my key

The rest of this blog post is optional reading.

I will illustrate one use case for my public key: verifying that this blog post was signed by my public key.

Importing

If you want to import my minimal public key to your GNU/Linux systems, save the .asc file at this link to baltakatei.asc and then run:

$ cat baltakatei.asc | gpg --import

My current version of GnuPG is 2.2.12.
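After importing, it is worth confirming that the fingerprint gpg reports matches the listing at the top of this post; something along these lines should work:

$ gpg --fingerprint baltakatei@gmail.com

The primary key fingerprint should read 3457 A265 922A 1F38 39DB 0264 A0A2 95AB DC34 69C9.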

Verify Git commits

With Git version 2.20.1, and GnuPG 2.2.12, running the following commands will let you verify this website's git repository. This assumes you've imported my public key into gpg as described earlier.

$ git clone https://reboil.com/gitweb/ikiwiki-blog.git BK-2020-08-1
$ cd BK-2020-08-1
$ git submodule 
$ git log --show-signature

Note, the URL to this website's git repository should be located at the bottom of this page.

If git and gpg are playing nice, you should see text resembling the following:

commit cfbbabc7326257bd15ff06245b410c16bbf66a05 (HEAD -> master, origin/master, origin/HEAD)
gpg: Signature made Tue 29 Jun 2021 01:44:23 AM GMT
gpg:                using RSA key 38F96437C83AC88E28B7A95257DA57D9517E6F86
gpg: Good signature from "Steven Sandoval <baltakatei@gmail.com>" [ultimate]
Primary key fingerprint: 3457 A265 922A 1F38 39DB  0264 A0A2 95AB DC34 69C9
     Subkey fingerprint: 38F9 6437 C83A C88E 28B7  A952 57DA 57D9 517E 6F86
Author: Steven Baltakatei Sandoval <baltakatei@gmail.com>
Date:   2021-06-29T01:44:02+00:00

    chore(posts):sed replace baltakatei.com link (https -> http)

commit 0f280e02635f3607a95ce09e5eca6c96da302ef1
gpg: Signature made Tue 29 Jun 2021 01:36:30 AM GMT
gpg:                using RSA key 38F96437C83AC88E28B7A95257DA57D9517E6F86
gpg: Good signature from "Steven Sandoval <baltakatei@gmail.com>" [ultimate]
Primary key fingerprint: 3457 A265 922A 1F38 39DB  0264 A0A2 95AB DC34 69C9
     Subkey fingerprint: 38F9 6437 C83A C88E 28B7  A952 57DA 57D9 517E 6F86
Author: Steven Baltakatei Sandoval <baltakatei@gmail.com>
Date:   2021-06-29T01:36:10+00:00

    feat(posts:20210629):blog post about devoe thermo textbook

commit ba29902a9234e6bf3b84362b009fd3697580d088
gpg: Signature made Tue 29 Jun 2021 01:09:20 AM GMT
gpg:                using RSA key 38F96437C83AC88E28B7A95257DA57D9517E6F86
gpg: Good signature from "Steven Sandoval <baltakatei@gmail.com>" [ultimate]
Primary key fingerprint: 3457 A265 922A 1F38 39DB  0264 A0A2 95AB DC34 69C9
     Subkey fingerprint: 38F9 6437 C83A C88E 28B7  A952 57DA 57D9 517E 6F86
Author: Steven Baltakatei Sandoval <baltakatei@gmail.com>
Date:   2021-06-29T01:09:00+00:00

    chore(posts:20210618):Add updated PDF link to glyph hunt article
Posted 2021-06-29T23:11:49+0000

TeXmacs Thermodynamics Textbook Milestone

Created by Steven Baltakatei Sandoval on 2021-06-29T01:10Z under a CC BY-SA 4.0 license and last updated on 2021-06-29T01:35Z.

For the past few months I have been transcribing a thermodynamics textbook called "Thermodynamics and Chemistry" into TeXmacs. The textbook is authored by Professor Howard DeVoe, is available online, and is licensed CC BY 4.0.

The project started out because I wanted to solve some thermodynamics problem sets in such a way that others could use my solutions to help them learn on their own. Often, example problems presented in chemistry textbooks are not very verbose, for the sake of reducing the publication costs of printing the final textbook. However, given the increase in remote learners due to increasingly ubiquitous internet connections (helped along by the COVID-19 pandemic), textbooks need not be physical artifacts. I believe libre textbooks, such as those published under Creative Commons licenses that promote sharing, are likely to become more ubiquitous as time goes on. I searched for a thermodynamics textbook that had a Creative Commons license and found the one by Professor Howard DeVoe. I created a GitLab repository where I push TeXmacs source code that others may download in order to compile their own copy of the textbook.

I chose to transcribe the textbook into TeXmacs source code instead of the currently dominant typesetting language/ecosystem of LaTeX for two reasons:

  1. I wanted a typesetting software package that students new to math equations could quickly learn in order to copy/paste/modify my work into their own homework and notes.
  2. As someone currently outside of academia, I am not required to use LaTeX.

As of today, I finished transcribing most of the textbook's fourteen chapters. I still must transcribe the Appendices, the Bibliography database, and various biographical sketches. However, my first-pass best effort has been accomplished.

My transcribed version has a PDF (latest draft here). Check the GitLab repository for updates.

I would like to thank Professor Howard DeVoe for providing the LaTeX source code and images used to build his own version of the textbook. I hope to be able to update my text as he updates his.

I'm open to collaboration via Twitter or email.

Posted 2021-06-29T01:36:10+0000

Hunt for the Dashed Vertical Bar Glyph

Created by Steven Baltakatei Sandoval on 2021-06-18T22:01Z under a CC BY-SA 4.0 license and last updated on 2021-06-29T01:08Z.

I created a TeXmacs article here (PDF) about some missing glyphs I needed to transcribe the "Galvanic Cells" chapter of DeVoe's Thermodynamics and Chemistry. My conclusion is that two new Unicode characters should be added.

I found the TeXmacs format particularly useful in this case because the subject I wrote about was a missing Unicode character needed to meet IUPAC's recommendations for drawing electrochemical cell line diagrams (see page 4 of this PDF). TeXmacs allowed me to import a custom EPS file which it then converts to PNG format when exporting my article source code into HTML format. See the article for details.

Posted 2021-06-18T22:22:36+0000

Bonneville Dam photographs

Created by Steven Baltakatei Sandoval on 2021-06-14T05:32Z under a CC BY-SA 4.0 license and last updated on 2021-06-14T05:36Z.

A few days ago I took some photographs while visiting the Bonneville Lock and Dam Visitor Center. I created a web page for them in my TeXmacs article section here.

I enjoyed playing with TeXmacs and annotating the photographs.

Posted 2021-06-14T05:37:32+0000

Kyoto Animation arson attack Wikipedia article update

Created by Steven Baltakatei Sandoval on 2021-06-07T07:03Z under a CC BY-SA 4.0 license and last updated on 2021-06-07T08:21Z.

Summary

As I have several times before, I tidied up the references in the wikicode of the Kyoto Animation arson attack article on Wikipedia. I blogged about this earlier.

If I create an article from scratch or am present when an article is being initially formed, I try to store <ref> citation (a.k.a. "references") information as a list at the end of the article's wikicode instead of as in-line references spread throughout the body text. I find the task of verifying references easier if their fields (e.g. title=, url=, access-date=, publisher=) are organized in the wikicode with indentation; the indentation makes the information more human-readable. I hope my efforts to organize references this way helps future editors verify information more quickly.

The Wikipedia template I use to store references as a list is called {{Reflist}}; the template is described here.

Background

I first learned about the arson attack back in 2019 when I was living in Mountain View, California. I have enjoyed Kyoto Animation's work (see this fan-made visual summary); the Melancholy of Haruhi Suzumiya series in particular resonated with me when I was grappling with religion and the importance of grounding yourself with rational people. I felt very strongly that the events surrounding the violent murder of Kyoto Animation (a.k.a. "KyoAni") artists should be properly referenced on Wikipedia.

I remember tweeting to encourage people who had photos of the affected building to upload those photos; I remember there were no CC BY-SA photographs of the affected building at the time. Someone named @Thibaut managed to find a photograph of the studio that Mike Hattsu had uploaded to Twitter years before the incident. This photograph was later uploaded to Wikimedia Commons here and has been used in the Wikipedia article about the incident. I am glad whenever people upload useful information to Wikimedia Commons since the default Creative Commons license helps editors like me collaborate to build useful articles on Wikipedia that can be shared freely.

Editing Notes

In the week after the incident I remember having to make quick, short edits to the article; this was because many editors were making frequent changes to the wikicode of the article that conflicted with my reflist edits, which spanned large portions of the code. In contrast, today I was able to take my time and perform all the changes in a single edit.

I also was amused to see that some people expanded on my use of the ISO 8601 (wiki) YYYYMMDD date format in some reflist entries; for example, one reference is named nhk_20190718T2124+09_deathcount. This label communicates the following about the cited resource:

  1. The resource is an NHK publication (specifically this one).
  2. The resource is dated the 18th day of July in the year 2019 at 21:24 in the UTC+09 time zone (JST).
  3. The resource is about a number of people killed.

Although I would not have chosen such a verbose timestamp string (the purpose of the YYYYMMDD string is simply to help differentiate multiple publications published at different times), the ISO 8601 format permits reduced precision, which is useful because often I do not want to think too much about the exact second an event starts or ends; citing the day on which an event occurred is often sufficiently precise.

Future Work

The next tasks I see that should be performed for the Kyoto Animation arson attack article are (lowest-hanging fruit first):

  • Add frequently missing reflist fields (e.g. date=, last=, first=, archive-url=, archive-date=, url-status=).
  • Reformat reflist fields to use consistent order.
  • Read sources and verify semantic content.

Special Thanks

I would like to thank the Internet Archive for saving snapshots of webpages that succumb to link rot. Please donate to them; I think they are currently the best of humanity's efforts to secure the past.

Posted 2021-06-07T08:01:53+0000

FreedomBox Calendar

Created by Steven Baltakatei Sandoval on 2021-03-10T22:45Z under a CC BY-SA 4.0 license and last updated on 2021-03-12T01:50Z.

Summary

I decided to try the Radicale CalDAV software available as a FreedomBox app. My motivation for doing so was to be less reliant upon Google Calendar, the calendar and task server I have been using for years.

Background

When I first started using computer calendars, I was attending university. I had barely started using email via Gmail in my last year of high school. I remember using the calendar software on the iMac my father purchased for me; the software was very useful in helping me keep track of when all my classes were in the fast-paced Stanford quarter system. Later, I used mostly Microsoft Outlook because my employer used it for internal meeting scheduling and correspondence. After I read Free Software, Free Society by Richard Stallman, I grew a desire to use free and open-source software whenever it was available.

To that end, one of the first major changes I made was to switch from Microsoft's Windows and (at the time) Apple's OS X to GNU/Linux Debian. I quickly learned that dual-booting was more hassle than it was worth. I found Debian was a viable substitute for every major computing activity of mine except for gaming. Therefore, I purchased a dedicated GNU/Linux machine from Think Penguin and ran Debian on that. I have been satisfied ever since.

However, some computing activities I did not switch over because I had not been using them as often. I haven't been maintaining a personal calendar because either:

  1. A schedulable event will involve my employer, so I'd default to what they require.

  2. Schedulable events were so infrequent that I didn't think I needed to bother with even Google Calendar.

For years I have neglected to maintain a personal calendar. However, I believe it is a useful activity, especially when planning tasks for myself. I have played with Org Mode but I haven't caught on to using its task management functionality (i.e. "TODO"), which appears very robust. Org Mode attempts to do as much as it can in the GNU Emacs text editor. However, I have yet to see an Emacs calendar synchronization function that I like. I'm sure one exists, but in discussion threads about what people use Emacs for, I haven't noticed such an Emacs package. So, even though I'd like to maintain a calendar and task list, Org Mode hasn't really caught my attention. That's where my experiment with FreedomBox comes in.

FreedomBox is a free and open-source project to make server software available and usable to non-sysadmin people. Although I have some experience in crafting my own Bash scripts and messing with Debian system services via systemd commands, I have been averse to messing with web server software such as Apache. That changed after I purchased a FreedomBox device from Olimex (the "Pioneer Edition Freedombox Home Server Kit"); the device is a small ARM computer about the size of a deck of playing cards. It comes with a set of apps, which includes a calendar server called "Radicale". Radicale stores calendar, journal, and task data that remote clients can synchronize with; this synchronization can occur over the public internet since FreedomBox software has been designed to provide all its apps with the ability to securely communicate using a user's own public domain name (e.g. reboil.com for this blog). This means that I can activate the Radicale app, initialize a calendar, copy the provided synchronization URL into my smartphone or computer's calendar app, and then, as a result, be able to update the calendar while I am away from home (specifically, away from the FreedomBox located there). None of this process requires that I save any data with a commercial 3rd party such as Google or Apple. With the FreedomBox, I am my own server.

Setup

The setup instructions for Radicale can be found here in the Debian wiki for the FreedomBox. Basically, the steps are:

  1. Setup the FreedomBox (Let's Encrypt certificate, Dynamic DNS if applicable, a username/password to push/pull CalDAV data)
  2. Install Radicale app (the CalDAV server) on the FreedomBox's system web interface.
  3. Login to Radicale as a FreedomBox user and create a calendar/journal/task repository (?) via the Radicale web interface.
  4. Copy the repository's (?) CalDAV URL into a client app (e.g. DAVx5 for Android, Thunderbird for GNU/Linux Debian) on a client device (e.g. smartphone). Provide the client app with the password for a FreedomBox user.

After following the steps described in the wiki for my Android phone and my work machine's Thunderbird instances, I found I was able to successfully push a calendar event from Thunderbird and a task from Android. Now I don't have to use Google Calendar!
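If a client app refuses to connect, a quick sanity check of the CalDAV endpoint can be made from a shell with curl; the URL path and username below are placeholders, since the exact Radicale location depends on your FreedomBox's domain and users:

$ curl --include --user alice --request PROPFIND --header "Depth: 0" \
    https://example.com/radicale/alice/

An HTTP 207 multi-status response with XML in the body indicates the server and credentials are working; an authentication error or a 404 points to the username or URL being wrong.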

Summary

Using FreedomBox and Radicale, I set up my own personal digital calendar server without needing to involve Google Calendar at any step.

Posted 2021-03-10T23:56:08+0000

Personal Text Logger

Created by Steven Baltakatei Sandoval on 2021-03-09T06:11Z under a CC BY-SA 4.0 license and last updated on 2021-03-10T02:46Z.

Summary

I wrote a bash script a while ago for compressing and encrypting log text streams produced by other scripts. I call it bklog. A static copy (version 0.1.33) is available here but I version track it in an environmental sensor script repository here.

Background

Born from a desire to record my surroundings

I wanted a program to write to disk the observations captured by environmental sensor data loggers. I wanted such a program to also permit me to encrypt captured sensor data against a public key, in case I have it capturing sensitive data (e.g. my personal smartphone's location data) and need to transfer the data through shared spaces (e.g. a cloud service provider or someone's computer).

The reason I wanted this program was to be able to do something with data produced by Raspberry Pi devices I have been tinkering with. I could see myself generating temperature, air pressure, location, and other trendable data. At first, I started making an ad hoc bash script to record each of these items, but I decided to make a general script that could compress and encrypt any text stream via stdin.

Encrypted with age

I had seen age recommended as a command line encryption tool whose main feature was that it had fewer configuration options than gnupg, a tool often used for encrypting files. I was attracted to age because it accepted input data in the form of stdin and its public keys took the form of 62-char strings. Also, the key generation process was all done in the command line. Here is an example of what generating a private key with age (beta1) looks like; the AGE-SECRET-KEY line, saved to a file, is the secretkey.txt referenced later:

baltakatei@mycomputer:~$ age-keygen
# created: 2021-03-09T06:32:52Z
# public key: age14a65znxam4k45kd4xg0lc8uu0yvlpyj376kap970mcxf066wycqq57xzqd
AGE-SECRET-KEY-1VEDW7UEKRCE6P2NUDEDXZF0RJ23QTND5PZP97SHHD5TG6Z7CL70Q09A4D5

As of 2021-03-09, age is in beta7. The current version of bklog, 0.1.33, assumes use of beta2.
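As a quick round trip with the example key above, and assuming the AGE-SECRET-KEY line was saved to secretkey.txt, encryption and decryption look like this (shown with current age syntax; the beta versions may differ slightly):

$ echo "hello" | age -r age14a65znxam4k45kd4xg0lc8uu0yvlpyj376kap970mcxf066wycqq57xzqd > hello.age
$ age -d -i secretkey.txt hello.age
hello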

Recently useful for recording uptime statistics

I hadn't touched bklog in a while after getting it to work successfully recording temperature and location data with a set of Raspberry Pi Zero W devices I played with during the 2020 COVID-19 restrictions. However, recently I did find a use for it when I decided to try to collect some uptime, bandwidth, and system process data from the server that runs this blog. I was pleased that past-me had decided to include usage information for the many options I built into the program. I wrote some bash scripts whose only purpose was to output a single stream of data continuously. For example, here's the simple script named uptime_continuous.sh for producing uptime data:

#!/bin/bash
# Desc: Outputs `uptime` every 15 seconds

while true; do
    uptime &
    sleep 15;
done;

Such a script produces output like this:

07:13:47 up  1:30,  1 user,  load average: 0.36, 0.38, 0.37
07:14:02 up  1:30,  1 user,  load average: 0.42, 0.40, 0.37
07:14:17 up  1:31,  1 user,  load average: 0.40, 0.39, 0.37

A separate bash script, to be run by cron, would pipe the output of this initial stream into bklog, which would take care of the job of compressing, encrypting, and writing the data to disk. Because bklog was intended to run on small GNU/Linux systems such as Raspberry Pi devices that use SD cards (flash memory with a limited number of writes), it collects a buffer for a period of time (default: 10 minutes) before compressing, encrypting, and writing a separate file to a memory-only directory (default: /dev/shm). Then, this file is appended to a time-stamped output tar file (default: one tar file per day). I provided several option flags that allow one to adjust time zone, output file name patterns, time periods, etc. An example use of uptime_continuous.sh with encryption enabled is here:

#!/bin/bash
# Desc: Logs system statistics
# Note: Run at boot or every day at midnight UTC
# Depends: bklog, ifstat, top, uptime

~/.local/bin/uptime_continuous.sh | /usr/local/sbin/bklog -v -e \
  -r age14a65znxam4k45kd4xg0lc8uu0yvlpyj376kap970mcxf066wycqq57xzqd \
  -o "/home/admin/logs" -l "uptime" -w ".log" -c -z "UTC" -b "600" -B "day" \
  1>~/$(date +%s)..uptime_logger.log 2>&1 &
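Because the -B "day" option gives bklog a time-to-live of one day, cron needs to relaunch the wrapper script at boot and at each midnight. Assuming the wrapper above is saved as /home/admin/.local/bin/log_uptime.sh (a hypothetical path), crontab entries along these lines would do, given that the server clock runs on UTC:

@reboot /home/admin/.local/bin/log_uptime.sh
0 0 * * * /home/admin/.local/bin/log_uptime.sh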

Here is usage information that can be obtained by running $ bklog --help:

USAGE:
    cmd | bklog [ options ]

OPTIONS:
    -h, --help
            Display help information.
    --version
            Display script version.
    -v, --verbose
            Display debugging info.
    -e, --encrypt
            Encrypt output.
    -r, --recipient [ string pubkey ]
            Specify recipient. May be age or ssh pubkey.
            May be specified multiple times for multiple pubkeys.
            See https://github.com/FiloSottile/age
    -o, --output [ path dir ]
            Specify output directory to save logs. This option is required
            to save log data.
    -p, --process-string [ filter command ] [ output file extension] 
            Specify how to create and name a processed version of the stdin.
            For example, if stdin is 'nmea' location data:

            -p "gpsbabel -i nmea -f - -o gpx -F - " ".gpx"

            This option would cause the stdin to 'bklog' to be piped into
            the 'gpsbabel' command, interpreted as 'nmea' data, converted
            into 'gpx' format, and then appended to the output tar file
            as a file with a '.gpx' extension.
            This option may be specified multiple times in order to output
            results of multiple different processing methods.
    -l, --label [ string ]
            Specify a label to be included in all output file names.
            Ex: 'location' if stdin is location data.
    -w, --store-raw [ file extension ]
            Specify file extension of file within output tar that contains
            raw stdin data. The default behavior is to always save raw stdin
            data in a '.stdin' file. Example usage when 'bklog' receives
            'nmea' data from 'gpspipe -r':

            -w ".nmea"

            Stdin data is saved in a '.nmea' file within the output tar.
    -W, --no-store-raw
            Do not store raw stdin in output tar.
    -c, --compress
            Compress output with gzip (before encryption if enabled).
    -z, --time-zone
            Specify time zone. (ex: "America/New_York")
    -t, --temp-dir [path dir]
            Specify parent directory for temporary working directory.
            Default: "/dev/shm"
    -R, --recipient-dir [path dir]
            Specify directory containing files whose first lines are
            to be interpreted as pubkey strings (see '-r' option). Only
            one directory may be specified.
    -b, --buffer-ttl [integer]
            Specify custom buffer period in seconds (default: 300 seconds)
    -B, --script-ttl [time element string]
            Specify custom script time-to-live in seconds (default: "day")
            Valid values: "day", "hour"

Here, bklog is located within /usr/local/sbin/. The output directory is specified to be /home/admin/logs. Each file saved in the output tar archive contains "uptime" in its name and the raw data files end with .log. The time zone is specified to be "UTC" (which is what my server uses and is useful since I am programming the cron job to run at midnight every day). Buffered data is written every 600 seconds. The verbose diagnostic output (optional; 1>) and any error messages (2>&1) are written to a time-stamped (UNIX epoch seconds) file in the home folder ($(date +%s)..uptime_logger.log). Files are encrypted against the age public key defined by the string "age14a65znxam4k45kd4xg0lc8uu0yvlpyj376kap970mcxf066wycqq57xzqd".

The result is a tar file named 20210309..mycomputer_uptime.gz.age.tar. The hostname mycomputer is included by default. The file's contents after about an hour are:

$ tar --list -f 20210309..mycomputer_uptime.gz.age.tar
20210309T090729+0000..VERSION
20210309T090726+0000--PT2M45S..mycomputer_uptime.log.gz.age
20210309T091011+0000--PT10M0S..mycomputer_uptime.log.gz.age
20210309T092011+0000--PT10M1S..mycomputer_uptime.log.gz.age
20210309T093012+0000--PT10M0S..mycomputer_uptime.log.gz.age
20210309T094012+0000--PT10M1S..mycomputer_uptime.log.gz.age
20210309T095013+0000--PT10M0S..mycomputer_uptime.log.gz.age
20210309T100013+0000--PT10M0S..mycomputer_uptime.log.gz.age
20210309T101013+0000--PT10M1S..mycomputer_uptime.log.gz.age
20210309T102014+0000--PT10M0S..mycomputer_uptime.log.gz.age

Each file's name includes an ISO 8601 time period before the ".." separator. I make use of the "PT" prefix, which in ISO 8601 marks a duration expressed in time components (e.g. PT10M0S is ten minutes). "--" is recommended by the standard as a replacement for "/" since "/" causes problems when used within UNIX file names.

VERSION files contain the version of bklog and age used as well as some other metadata useful for someone interpreting the archive.

Other scripts and commands can be used to automatically extract and reconstitute a continuous uptime file but the stream of uptime data produced by uptime_continuous.sh is all saved.

Some example commands that can decrypt the files are:

$ tar -xf 20210309..mycomputer_uptime.gz.age.tar  # extract files
$ for file in ./*.age; do
  age -d -i ~/secretkey.txt "$file" | gunzip > "${file%.gz.age}";
done;

Where secretkey.txt is the same file generated by age-keygen described earlier.
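Because each extracted file name begins with an ISO 8601 timestamp, the shell's lexicographic glob ordering is also chronological, so reconstituting one continuous log is just a concatenation (assuming the decrypted .log files from the example above are in the current directory):

$ cat ./*..mycomputer_uptime.log > mycomputer_uptime_all.log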

Conclusion

I found that an older script I wrote for recording environmental sensor data was also useful in recording system statistics. I described how uptime data could be regularly produced by a custom script, uptime_continuous.sh, and piped into bklog for compression, encryption, and writing.

Posted 2021-03-09T11:07:19+0000

This blog is powered by ikiwiki.

Text is available under the Creative Commons Attribution-ShareAlike license; additional terms may apply.