1 <?xml version="1.0" encoding="ISO-8859-1"?>
2 <rss version='2.0' xmlns:lj='http://www.livejournal.org/rss/lj/1.0/'>
3 <channel>
4 <title>Petter Reinholdtsen - Entries from November 2023</title>
5 <description>Entries from November 2023</description>
6 <link>https://people.skolelinux.org/pere/blog/</link>
7
8
9 <item>
10 <title>New and improved sqlcipher in Debian for accessing Signal database</title>
11 <link>https://people.skolelinux.org/pere/blog/New_and_improved_sqlcipher_in_Debian_for_accessing_Signal_database.html</link>
12 <guid isPermaLink="true">https://people.skolelinux.org/pere/blog/New_and_improved_sqlcipher_in_Debian_for_accessing_Signal_database.html</guid>
13 <pubDate>Sun, 12 Nov 2023 12:00:00 +0100</pubDate>
14 <description>&lt;p&gt;For a while now I have wanted direct access to the
15 &lt;a href=&quot;https://signal.org/&quot;&gt;Signal&lt;/a&gt; database of messages and
16 channels in my desktop edition of Signal. These days I prefer the
17 enforced end-to-end encryption of Signal for my communication with
18 friends and family, both to increase the level of safety and privacy
19 and to raise the cost of the mass surveillance that government and
20 non-government entities practice. In August I came across
21 a nice
22 &lt;a href=&quot;https://www.yoranbrondsema.com/post/the-guide-to-extracting-statistics-from-your-signal-conversations/&quot;&gt;recipe
23 on how to use sqlcipher to extract statistics from the Signal
24 database&lt;/a&gt;. Unfortunately the recipe did not
25 work with the version of sqlcipher in Debian. The
26 &lt;a href=&quot;http://tracker.debian.org/sqlcipher/&quot;&gt;sqlcipher&lt;/a&gt;
27 package is a &quot;fork&quot; of the sqlite package with added support for
28 encrypted databases. Sadly the current Debian maintainer
29 &lt;a href=&quot;https://bugs.debian.org/961598&quot;&gt;announced more than three
30 years ago that he did not have time to maintain sqlcipher&lt;/a&gt;, so it
31 seemed unlikely to be upgraded by the maintainer. I was reluctant to
32 take on the job myself, as I have very limited experience maintaining
33 shared libraries in Debian. After waiting and hoping for a few
34 months, I gave up last week and set out to update the package. In
35 the process I orphaned it, to make it obvious to the next person
36 looking at it that the package needs proper maintenance.&lt;/p&gt;
37
38 &lt;p&gt;The version in Debian was around five years old, and quite a lot of
39 changes had taken place upstream since the last import into the Debian
40 maintenance git repository. After spending a few days importing the new
41 upstream versions, I realised that upstream did not care much about
42 SONAME versioning, as library symbols were both added and removed with
43 minor version number changes to the project. I concluded that I had to
44 do a SONAME bump of the library package to avoid surprising the
45 reverse dependencies. I even added a simple
46 autopkgtest script to ensure the package works as intended. Deep in
47 the hole of learning shared library maintenance, I set out a few
48 days ago to upload the new version to Debian experimental to see what
49 the quality assurance framework in Debian had to say about the result.
50 The feedback told me the package was not too shabby, and yesterday I
51 uploaded the latest version to Debian unstable. It should enter
52 testing today or tomorrow, perhaps delayed by
53 &lt;a href=&quot;https://bugs.debian.org/1055812&quot;&gt;a small library
54 transition&lt;/a&gt;.&lt;/p&gt;
55
56 &lt;p&gt;Armed with a new version of sqlcipher, I can now have a look at the
57 SQL database in ~/.config/Signal/sql/db.sqlite. First, one needs to
58 fetch the encryption key from the Signal configuration using this
59 simple JSON extraction command:&lt;/p&gt;
60
61 &lt;pre&gt;/usr/bin/jq -r &#39;.&quot;key&quot;&#39; ~/.config/Signal/config.json&lt;/pre&gt;
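&lt;p&gt;The same extraction can be sketched in Python using only the
standard library. This is a minimal sketch assuming the plain-text
&quot;key&quot; entry in config.json described above; the
signal_key_pragma() helper name is my own invention:&lt;/p&gt;

```python
import json

def signal_key_pragma(config_path):
    # Build the sqlcipher key statement from Signal's config.json.
    # Assumes the plain-text "key" entry described in this post;
    # signal_key_pragma is a hypothetical helper, not a Signal API.
    with open(config_path) as fp:
        key = json.load(fp)["key"]
    return 'PRAGMA key = "x\'%s\'";' % key
```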
62
63 &lt;p&gt;Assume the result from that command is &#39;secretkey&#39;, a
64 hexadecimal string representing the key used to encrypt the database.
65 One can now connect to the database and inject the encryption
66 key to fetch information from it via SQL. Here
67 is an example dumping the database structure:&lt;/p&gt;
68
69 &lt;pre&gt;
70 % sqlcipher ~/.config/Signal/sql/db.sqlite
71 sqlite&gt; PRAGMA key = &quot;x&#39;secretkey&#39;&quot;;
72 sqlite&gt; .schema
73 CREATE TABLE sqlite_stat1(tbl,idx,stat);
74 CREATE TABLE conversations(
75 id STRING PRIMARY KEY ASC,
76 json TEXT,
77
78 active_at INTEGER,
79 type STRING,
80 members TEXT,
81 name TEXT,
82 profileName TEXT
83 , profileFamilyName TEXT, profileFullName TEXT, e164 TEXT, serviceId TEXT, groupId TEXT, profileLastFetchedAt INTEGER);
84 CREATE TABLE identityKeys(
85 id STRING PRIMARY KEY ASC,
86 json TEXT
87 );
88 CREATE TABLE items(
89 id STRING PRIMARY KEY ASC,
90 json TEXT
91 );
92 CREATE TABLE sessions(
93 id TEXT PRIMARY KEY,
94 conversationId TEXT,
95 json TEXT
96 , ourServiceId STRING, serviceId STRING);
97 CREATE TABLE attachment_downloads(
98 id STRING primary key,
99 timestamp INTEGER,
100 pending INTEGER,
101 json TEXT
102 );
103 CREATE TABLE sticker_packs(
104 id TEXT PRIMARY KEY,
105 key TEXT NOT NULL,
106
107 author STRING,
108 coverStickerId INTEGER,
109 createdAt INTEGER,
110 downloadAttempts INTEGER,
111 installedAt INTEGER,
112 lastUsed INTEGER,
113 status STRING,
114 stickerCount INTEGER,
115 title STRING
116 , attemptedStatus STRING, position INTEGER DEFAULT 0 NOT NULL, storageID STRING, storageVersion INTEGER, storageUnknownFields BLOB, storageNeedsSync
117 INTEGER DEFAULT 0 NOT NULL);
118 CREATE TABLE stickers(
119 id INTEGER NOT NULL,
120 packId TEXT NOT NULL,
121
122 emoji STRING,
123 height INTEGER,
124 isCoverOnly INTEGER,
125 lastUsed INTEGER,
126 path STRING,
127 width INTEGER,
128
129 PRIMARY KEY (id, packId),
130 CONSTRAINT stickers_fk
131 FOREIGN KEY (packId)
132 REFERENCES sticker_packs(id)
133 ON DELETE CASCADE
134 );
135 CREATE TABLE sticker_references(
136 messageId STRING,
137 packId TEXT,
138 CONSTRAINT sticker_references_fk
139 FOREIGN KEY(packId)
140 REFERENCES sticker_packs(id)
141 ON DELETE CASCADE
142 );
143 CREATE TABLE emojis(
144 shortName TEXT PRIMARY KEY,
145 lastUsage INTEGER
146 );
147 CREATE TABLE messages(
148 rowid INTEGER PRIMARY KEY ASC,
149 id STRING UNIQUE,
150 json TEXT,
151 readStatus INTEGER,
152 expires_at INTEGER,
153 sent_at INTEGER,
154 schemaVersion INTEGER,
155 conversationId STRING,
156 received_at INTEGER,
157 source STRING,
158 hasAttachments INTEGER,
159 hasFileAttachments INTEGER,
160 hasVisualMediaAttachments INTEGER,
161 expireTimer INTEGER,
162 expirationStartTimestamp INTEGER,
163 type STRING,
164 body TEXT,
165 messageTimer INTEGER,
166 messageTimerStart INTEGER,
167 messageTimerExpiresAt INTEGER,
168 isErased INTEGER,
169 isViewOnce INTEGER,
170 sourceServiceId TEXT, serverGuid STRING NULL, sourceDevice INTEGER, storyId STRING, isStory INTEGER
171 GENERATED ALWAYS AS (type IS &#39;story&#39;), isChangeCreatedByUs INTEGER NOT NULL DEFAULT 0, isTimerChangeFromSync INTEGER
172 GENERATED ALWAYS AS (
173 json_extract(json, &#39;$.expirationTimerUpdate.fromSync&#39;) IS 1
174 ), seenStatus NUMBER default 0, storyDistributionListId STRING, expiresAt INT
175 GENERATED ALWAYS
176 AS (ifnull(
177 expirationStartTimestamp + (expireTimer * 1000),
178 9007199254740991
179 )), shouldAffectActivity INTEGER
180 GENERATED ALWAYS AS (
181 type IS NULL
182 OR
183 type NOT IN (
184 &#39;change-number-notification&#39;,
185 &#39;contact-removed-notification&#39;,
186 &#39;conversation-merge&#39;,
187 &#39;group-v1-migration&#39;,
188 &#39;keychange&#39;,
189 &#39;message-history-unsynced&#39;,
190 &#39;profile-change&#39;,
191 &#39;story&#39;,
192 &#39;universal-timer-notification&#39;,
193 &#39;verified-change&#39;
194 )
195 ), shouldAffectPreview INTEGER
196 GENERATED ALWAYS AS (
197 type IS NULL
198 OR
199 type NOT IN (
200 &#39;change-number-notification&#39;,
201 &#39;contact-removed-notification&#39;,
202 &#39;conversation-merge&#39;,
203 &#39;group-v1-migration&#39;,
204 &#39;keychange&#39;,
205 &#39;message-history-unsynced&#39;,
206 &#39;profile-change&#39;,
207 &#39;story&#39;,
208 &#39;universal-timer-notification&#39;,
209 &#39;verified-change&#39;
210 )
211 ), isUserInitiatedMessage INTEGER
212 GENERATED ALWAYS AS (
213 type IS NULL
214 OR
215 type NOT IN (
216 &#39;change-number-notification&#39;,
217 &#39;contact-removed-notification&#39;,
218 &#39;conversation-merge&#39;,
219 &#39;group-v1-migration&#39;,
220 &#39;group-v2-change&#39;,
221 &#39;keychange&#39;,
222 &#39;message-history-unsynced&#39;,
223 &#39;profile-change&#39;,
224 &#39;story&#39;,
225 &#39;universal-timer-notification&#39;,
226 &#39;verified-change&#39;
227 )
228 ), mentionsMe INTEGER NOT NULL DEFAULT 0, isGroupLeaveEvent INTEGER
229 GENERATED ALWAYS AS (
230 type IS &#39;group-v2-change&#39; AND
231 json_array_length(json_extract(json, &#39;$.groupV2Change.details&#39;)) IS 1 AND
232 json_extract(json, &#39;$.groupV2Change.details[0].type&#39;) IS &#39;member-remove&#39; AND
233 json_extract(json, &#39;$.groupV2Change.from&#39;) IS NOT NULL AND
234 json_extract(json, &#39;$.groupV2Change.from&#39;) IS json_extract(json, &#39;$.groupV2Change.details[0].aci&#39;)
235 ), isGroupLeaveEventFromOther INTEGER
236 GENERATED ALWAYS AS (
237 isGroupLeaveEvent IS 1
238 AND
239 isChangeCreatedByUs IS 0
240 ), callId TEXT
241 GENERATED ALWAYS AS (
242 json_extract(json, &#39;$.callId&#39;)
243 ));
244 CREATE TABLE sqlite_stat4(tbl,idx,neq,nlt,ndlt,sample);
245 CREATE TABLE jobs(
246 id TEXT PRIMARY KEY,
247 queueType TEXT STRING NOT NULL,
248 timestamp INTEGER NOT NULL,
249 data STRING TEXT
250 );
251 CREATE TABLE reactions(
252 conversationId STRING,
253 emoji STRING,
254 fromId STRING,
255 messageReceivedAt INTEGER,
256 targetAuthorAci STRING,
257 targetTimestamp INTEGER,
258 unread INTEGER
259 , messageId STRING);
260 CREATE TABLE senderKeys(
261 id TEXT PRIMARY KEY NOT NULL,
262 senderId TEXT NOT NULL,
263 distributionId TEXT NOT NULL,
264 data BLOB NOT NULL,
265 lastUpdatedDate NUMBER NOT NULL
266 );
267 CREATE TABLE unprocessed(
268 id STRING PRIMARY KEY ASC,
269 timestamp INTEGER,
270 version INTEGER,
271 attempts INTEGER,
272 envelope TEXT,
273 decrypted TEXT,
274 source TEXT,
275 serverTimestamp INTEGER,
276 sourceServiceId STRING
277 , serverGuid STRING NULL, sourceDevice INTEGER, receivedAtCounter INTEGER, urgent INTEGER, story INTEGER);
278 CREATE TABLE sendLogPayloads(
279 id INTEGER PRIMARY KEY ASC,
280
281 timestamp INTEGER NOT NULL,
282 contentHint INTEGER NOT NULL,
283 proto BLOB NOT NULL
284 , urgent INTEGER, hasPniSignatureMessage INTEGER DEFAULT 0 NOT NULL);
285 CREATE TABLE sendLogRecipients(
286 payloadId INTEGER NOT NULL,
287
288 recipientServiceId STRING NOT NULL,
289 deviceId INTEGER NOT NULL,
290
291 PRIMARY KEY (payloadId, recipientServiceId, deviceId),
292
293 CONSTRAINT sendLogRecipientsForeignKey
294 FOREIGN KEY (payloadId)
295 REFERENCES sendLogPayloads(id)
296 ON DELETE CASCADE
297 );
298 CREATE TABLE sendLogMessageIds(
299 payloadId INTEGER NOT NULL,
300
301 messageId STRING NOT NULL,
302
303 PRIMARY KEY (payloadId, messageId),
304
305 CONSTRAINT sendLogMessageIdsForeignKey
306 FOREIGN KEY (payloadId)
307 REFERENCES sendLogPayloads(id)
308 ON DELETE CASCADE
309 );
310 CREATE TABLE preKeys(
311 id STRING PRIMARY KEY ASC,
312 json TEXT
313 , ourServiceId NUMBER
314 GENERATED ALWAYS AS (json_extract(json, &#39;$.ourServiceId&#39;)));
315 CREATE TABLE signedPreKeys(
316 id STRING PRIMARY KEY ASC,
317 json TEXT
318 , ourServiceId NUMBER
319 GENERATED ALWAYS AS (json_extract(json, &#39;$.ourServiceId&#39;)));
320 CREATE TABLE badges(
321 id TEXT PRIMARY KEY,
322 category TEXT NOT NULL,
323 name TEXT NOT NULL,
324 descriptionTemplate TEXT NOT NULL
325 );
326 CREATE TABLE badgeImageFiles(
327 badgeId TEXT REFERENCES badges(id)
328 ON DELETE CASCADE
329 ON UPDATE CASCADE,
330 &#39;order&#39; INTEGER NOT NULL,
331 url TEXT NOT NULL,
332 localPath TEXT,
333 theme TEXT NOT NULL
334 );
335 CREATE TABLE storyReads (
336 authorId STRING NOT NULL,
337 conversationId STRING NOT NULL,
338 storyId STRING NOT NULL,
339 storyReadDate NUMBER NOT NULL,
340
341 PRIMARY KEY (authorId, storyId)
342 );
343 CREATE TABLE storyDistributions(
344 id STRING PRIMARY KEY NOT NULL,
345 name TEXT,
346
347 senderKeyInfoJson STRING
348 , deletedAtTimestamp INTEGER, allowsReplies INTEGER, isBlockList INTEGER, storageID STRING, storageVersion INTEGER, storageUnknownFields BLOB, storageNeedsSync INTEGER);
349 CREATE TABLE storyDistributionMembers(
350 listId STRING NOT NULL REFERENCES storyDistributions(id)
351 ON DELETE CASCADE
352 ON UPDATE CASCADE,
353 serviceId STRING NOT NULL,
354
355 PRIMARY KEY (listId, serviceId)
356 );
357 CREATE TABLE uninstalled_sticker_packs (
358 id STRING NOT NULL PRIMARY KEY,
359 uninstalledAt NUMBER NOT NULL,
360 storageID STRING,
361 storageVersion NUMBER,
362 storageUnknownFields BLOB,
363 storageNeedsSync INTEGER NOT NULL
364 );
365 CREATE TABLE groupCallRingCancellations(
366 ringId INTEGER PRIMARY KEY,
367 createdAt INTEGER NOT NULL
368 );
369 CREATE TABLE IF NOT EXISTS &#39;messages_fts_data&#39;(id INTEGER PRIMARY KEY, block BLOB);
370 CREATE TABLE IF NOT EXISTS &#39;messages_fts_idx&#39;(segid, term, pgno, PRIMARY KEY(segid, term)) WITHOUT ROWID;
371 CREATE TABLE IF NOT EXISTS &#39;messages_fts_content&#39;(id INTEGER PRIMARY KEY, c0);
372 CREATE TABLE IF NOT EXISTS &#39;messages_fts_docsize&#39;(id INTEGER PRIMARY KEY, sz BLOB);
373 CREATE TABLE IF NOT EXISTS &#39;messages_fts_config&#39;(k PRIMARY KEY, v) WITHOUT ROWID;
374 CREATE TABLE edited_messages(
375 messageId STRING REFERENCES messages(id)
376 ON DELETE CASCADE,
377 sentAt INTEGER,
378 readStatus INTEGER
379 , conversationId STRING);
380 CREATE TABLE mentions (
381 messageId REFERENCES messages(id) ON DELETE CASCADE,
382 mentionAci STRING,
383 start INTEGER,
384 length INTEGER
385 );
386 CREATE TABLE kyberPreKeys(
387 id STRING PRIMARY KEY NOT NULL,
388 json TEXT NOT NULL, ourServiceId NUMBER
389 GENERATED ALWAYS AS (json_extract(json, &#39;$.ourServiceId&#39;)));
390 CREATE TABLE callsHistory (
391 callId TEXT PRIMARY KEY,
392 peerId TEXT NOT NULL, -- conversation id (legacy) | uuid | groupId | roomId
393 ringerId TEXT DEFAULT NULL, -- ringer uuid
394 mode TEXT NOT NULL, -- enum &quot;Direct&quot; | &quot;Group&quot;
395 type TEXT NOT NULL, -- enum &quot;Audio&quot; | &quot;Video&quot; | &quot;Group&quot;
396 direction TEXT NOT NULL, -- enum &quot;Incoming&quot; | &quot;Outgoing
397 -- Direct: enum &quot;Pending&quot; | &quot;Missed&quot; | &quot;Accepted&quot; | &quot;Deleted&quot;
398 -- Group: enum &quot;GenericGroupCall&quot; | &quot;OutgoingRing&quot; | &quot;Ringing&quot; | &quot;Joined&quot; | &quot;Missed&quot; | &quot;Declined&quot; | &quot;Accepted&quot; | &quot;Deleted&quot;
399 status TEXT NOT NULL,
400 timestamp INTEGER NOT NULL,
401 UNIQUE (callId, peerId) ON CONFLICT FAIL
402 );
403 [ dropped all indexes to save space in this blog post ]
404 CREATE TRIGGER messages_on_view_once_update AFTER UPDATE ON messages
405 WHEN
406 new.body IS NOT NULL AND new.isViewOnce = 1
407 BEGIN
408 DELETE FROM messages_fts WHERE rowid = old.rowid;
409 END;
410 CREATE TRIGGER messages_on_insert AFTER INSERT ON messages
411 WHEN new.isViewOnce IS NOT 1 AND new.storyId IS NULL
412 BEGIN
413 INSERT INTO messages_fts
414 (rowid, body)
415 VALUES
416 (new.rowid, new.body);
417 END;
418 CREATE TRIGGER messages_on_delete AFTER DELETE ON messages BEGIN
419 DELETE FROM messages_fts WHERE rowid = old.rowid;
420 DELETE FROM sendLogPayloads WHERE id IN (
421 SELECT payloadId FROM sendLogMessageIds
422 WHERE messageId = old.id
423 );
424 DELETE FROM reactions WHERE rowid IN (
425 SELECT rowid FROM reactions
426 WHERE messageId = old.id
427 );
428 DELETE FROM storyReads WHERE storyId = old.storyId;
429 END;
430 CREATE VIRTUAL TABLE messages_fts USING fts5(
431 body,
432 tokenize = &#39;signal_tokenizer&#39;
433 );
434 CREATE TRIGGER messages_on_update AFTER UPDATE ON messages
435 WHEN
436 (new.body IS NULL OR old.body IS NOT new.body) AND
437 new.isViewOnce IS NOT 1 AND new.storyId IS NULL
438 BEGIN
439 DELETE FROM messages_fts WHERE rowid = old.rowid;
440 INSERT INTO messages_fts
441 (rowid, body)
442 VALUES
443 (new.rowid, new.body);
444 END;
445 CREATE TRIGGER messages_on_insert_insert_mentions AFTER INSERT ON messages
446 BEGIN
447 INSERT INTO mentions (messageId, mentionAci, start, length)
448
449 SELECT messages.id, bodyRanges.value -&gt;&gt; &#39;mentionAci&#39; as mentionAci,
450 bodyRanges.value -&gt;&gt; &#39;start&#39; as start,
451 bodyRanges.value -&gt;&gt; &#39;length&#39; as length
452 FROM messages, json_each(messages.json -&gt;&gt; &#39;bodyRanges&#39;) as bodyRanges
453 WHERE bodyRanges.value -&gt;&gt; &#39;mentionAci&#39; IS NOT NULL
454
455 AND messages.id = new.id;
456 END;
457 CREATE TRIGGER messages_on_update_update_mentions AFTER UPDATE ON messages
458 BEGIN
459 DELETE FROM mentions WHERE messageId = new.id;
460 INSERT INTO mentions (messageId, mentionAci, start, length)
461
462 SELECT messages.id, bodyRanges.value -&gt;&gt; &#39;mentionAci&#39; as mentionAci,
463 bodyRanges.value -&gt;&gt; &#39;start&#39; as start,
464 bodyRanges.value -&gt;&gt; &#39;length&#39; as length
465 FROM messages, json_each(messages.json -&gt;&gt; &#39;bodyRanges&#39;) as bodyRanges
466 WHERE bodyRanges.value -&gt;&gt; &#39;mentionAci&#39; IS NOT NULL
467
468 AND messages.id = new.id;
469 END;
470 sqlite&gt;
471 &lt;/pre&gt;
472
473 &lt;p&gt;Finally I have the tool I need to inspect and process Signal
474 messages without using the vendor-provided client. Now
475 on to transforming the data into a more useful format.&lt;/p&gt;
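&lt;p&gt;As a starting point for such transforming, here is a minimal
Python sketch. It assumes a decrypted copy of the database, for
example one produced with sqlcipher&#39;s sqlcipher_export() function,
since the sqlite3 module in the Python standard library can not read
the encrypted original. The dump_messages() helper is my own
invention, querying a few columns from the messages table shown
above:&lt;/p&gt;

```python
import sqlite3

def dump_messages(conn):
    # Return (conversationId, sent_at, body) for ordinary chat
    # messages, skipping notification rows such as 'keychange'.
    # Works on a plain-text copy of db.sqlite; dump_messages is a
    # hypothetical helper, not part of any Signal tool.
    rows = conn.execute(
        "SELECT conversationId, sent_at, body FROM messages "
        "WHERE type IN ('incoming', 'outgoing') ORDER BY sent_at"
    )
    return list(rows)
```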
476
477 &lt;p&gt;As usual, if you use Bitcoin and want to show your support of my
478 activities, please send Bitcoin donations to my address
479 &lt;b&gt;&lt;a href=&quot;bitcoin:15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b&quot;&gt;15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b&lt;/a&gt;&lt;/b&gt;.&lt;/p&gt;
480 </description>
481 </item>
482
483 <item>
484 <title>New chrpath release 0.17</title>
485 <link>https://people.skolelinux.org/pere/blog/New_chrpath_release_0_17.html</link>
486 <guid isPermaLink="true">https://people.skolelinux.org/pere/blog/New_chrpath_release_0_17.html</guid>
487 <pubDate>Fri, 10 Nov 2023 07:30:00 +0100</pubDate>
488 <description>&lt;p&gt;The chrpath package provides a simple command line tool to remove or
489 modify the rpath or runpath of a compiled ELF program. It is almost 10
490 years since I updated the code base, but I stumbled over the tool
491 today and decided it was time to move the code base from Subversion
492 to git and find a new home for it, as the previous one (Debian Alioth)
493 has been shut down. I decided to go with
494 &lt;a href=&quot;https://codeberg.org/&quot;&gt;Codeberg&lt;/a&gt; this time, as it is my git
495 service of choice these days. I did a quick and dirty migration to git
496 and updated the code with a few patches I found in the Debian bug
497 tracker. These are the release notes:&lt;/p&gt;
498
499 &lt;p&gt;New in 0.17 released 2023-11-10:&lt;/p&gt;
500
501 &lt;ul&gt;
502 &lt;li&gt;Moved project to Codeberg, as Alioth is shut down.&lt;/li&gt;
503 &lt;li&gt;Add Solaris support (use &amp;lt;sys/byteorder.h&gt; instead of &amp;lt;byteswap.h&gt;).
504 Patch from Rainer Orth.&lt;/li&gt;
505 &lt;li&gt;Added missing newline from printf() line. Patch from Frank Dana.&lt;/li&gt;
506 &lt;li&gt;Corrected handling of multiple ELF sections. Patch from Frank Dana.&lt;/li&gt;
507 &lt;li&gt;Updated build rules for .deb. Partly based on patch from djcj.&lt;/li&gt;
508 &lt;/ul&gt;
509
510 &lt;p&gt;The latest edition is tagged and available from
511 &lt;a href=&quot;https://codeberg.org/pere/chrpath&quot;&gt;https://codeberg.org/pere/chrpath&lt;/a&gt;.&lt;/p&gt;
512
513 &lt;p&gt;As usual, if you use Bitcoin and want to show your support of my
514 activities, please send Bitcoin donations to my address
515 &lt;b&gt;&lt;a href=&quot;bitcoin:15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b&quot;&gt;15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b&lt;/a&gt;&lt;/b&gt;.&lt;/p&gt;
516 </description>
517 </item>
518
519 <item>
520 <title>Test framework for DocBook processors / formatters</title>
521 <link>https://people.skolelinux.org/pere/blog/Test_framework_for_DocBook_processors___formatters.html</link>
522 <guid isPermaLink="true">https://people.skolelinux.org/pere/blog/Test_framework_for_DocBook_processors___formatters.html</guid>
523 <pubDate>Sun, 5 Nov 2023 13:00:00 +0100</pubDate>
524 <description>&lt;p&gt;All the books I have published so far have used
525 &lt;a href=&quot;https://docbook.org/&quot;&gt;DocBook&lt;/a&gt; somewhere in the process.
526 For the first book, the source format was DocBook, while for every
527 later book it was an intermediate format, used as the stepping stone to
528 present the same manuscript in several formats: on paper,
529 as an ebook in ePub format, as an HTML page and as a PDF file either for
530 paper production or for Internet consumption. This is made possible
531 by a wide variety of free software tools with DocBook support in
532 Debian. The source formats of later books have been docx via rst,
533 Markdown, Filemaker and Asciidoc, and for all of these I was able to
534 generate a suitable DocBook file for further processing using
535 &lt;a href=&quot;https://tracker.debian.org/pkg/pandoc&quot;&gt;pandoc&lt;/a&gt;,
536 &lt;a href=&quot;https://tracker.debian.org/pkg/asciidoc&quot;&gt;a2x&lt;/a&gt; and
537 &lt;a href=&quot;https://tracker.debian.org/pkg/asciidoctor&quot;&gt;asciidoctor&lt;/a&gt;,
538 as well as rendering using
539 &lt;a href=&quot;https://tracker.debian.org/pkg/xmlto&quot;&gt;xmlto&lt;/a&gt;,
540 &lt;a href=&quot;https://tracker.debian.org/pkg/dbtoepub&quot;&gt;dbtoepub&lt;/a&gt;,
541 &lt;a href=&quot;https://tracker.debian.org/pkg/dblatex&quot;&gt;dblatex&lt;/a&gt;,
542 &lt;a href=&quot;https://tracker.debian.org/pkg/docbook-xsl&quot;&gt;docbook-xsl&lt;/a&gt; and
543 &lt;a href=&quot;https://tracker.debian.org/pkg/fop&quot;&gt;fop&lt;/a&gt;.&lt;/p&gt;
544
545 &lt;p&gt;Most of the &lt;a href=&quot;http://www.hungry.com/~pere/publisher/&quot;&gt;books I
546 have published&lt;/a&gt; are translated books, with English as the source
547 language. The use of
548 &lt;a href=&quot;https://tracker.debian.org/pkg/po4a&quot;&gt;po4a&lt;/a&gt; to
549 handle translations using the gettext PO format has been a blessing,
550 but publishing translated books has triggered the need to ensure the
551 DocBook tools handle the relevant languages correctly. For every new
552 language I have published in, I had to submit patches to dblatex, dbtoepub
553 and docbook-xsl fixing language- and country-specific issues
554 in the frameworks themselves. Typically these have been missing keywords
555 like &#39;figure&#39; or incorrect sort ordering of index entries. After a while it
556 became tiresome to only discover issues like this by accident, and I
557 decided to write a DocBook &quot;test framework&quot; exercising various
558 features of DocBook and allowing me to see all features exercised for
559 a given language. It consists of a set of DocBook files: a version 4
560 book, a version 5 book, a v4 book set, a v4 selection of problematic
561 tables, one v4 book testing sidefloat and finally one v4 testing a book of
562 articles. The DocBook files are accompanied by a set of build rules
563 for building PDF using dblatex and docbook-xsl/fop, HTML using xmlto
564 or docbook-xsl, and ePub using dbtoepub. The result is a set of files
565 visualizing footnotes, indexes, tables of contents, figures,
566 formulas and other DocBook features, allowing for a quick review of
567 the completeness of the given locale settings. To build with a
568 different language setting, all one needs to do is edit the lang= value
569 in the .xml file to pick a different ISO 639 code value and run
570 &#39;make&#39;.&lt;/p&gt;
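&lt;p&gt;The language switch can be scripted. Here is a minimal Python
sketch of the edit described above, assuming the book sets the
language as a lang=&quot;...&quot; attribute on the root element;
set_docbook_lang() is a hypothetical helper name:&lt;/p&gt;

```python
import re

def set_docbook_lang(xml_text, lang_code):
    # Replace the first lang="..." attribute value with a new
    # ISO 639 code, as described above.  A regular expression is
    # enough for this simple, known document layout.
    return re.sub(r'lang="[^"]*"', 'lang="%s"' % lang_code,
                  xml_text, count=1)
```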
571
572 &lt;p&gt;The &lt;a href=&quot;https://codeberg.org/pere/docbook-example/&quot;&gt;test framework
573 source code&lt;/a&gt; is available from Codeberg, and a generated set of
574 presentations of the various examples is available as Codeberg static
575 web pages at
576 &lt;a href=&quot;https://pere.codeberg.page/docbook-example/&quot;&gt;https://pere.codeberg.page/docbook-example/&lt;/a&gt;.
577 Using this test framework I have been able to discover and report
578 several bugs and missing features in various tools, and got a lot of
579 them fixed. For example I got Northern Sami keywords added to both
580 docbook-xsl and dblatex, fixed several typos in Norwegian bokmål and
581 Norwegian Nynorsk, support for non-ascii title IDs added to pandoc,
582 Norwegian index sorting support fixed in xindy and initial Norwegian
583 Bokmål support added to dblatex. Some issues still remain, though.
584 Default index sorting rules are still broken in several tools, so the
585 Norwegian letters æ, ø and å are more often than not sorted incorrectly
586 in the book index.&lt;/p&gt;
587
588 &lt;p&gt;The test framework recently received some more polish, as part of
589 publishing my latest book. This book contained a lot of fairly
590 complex tables, which exposed bugs in some of the tools. This made me
591 add a new test file with various tables, as well as spend some time to
592 brush up the build rules. My goal is for the test framework to
593 exercise all DocBook features to make it easier to see which features
594 work with different processors, and hopefully get them all to support
595 the full set of DocBook features. Feel free to send patches to extend
596 the test set, and test it with your favorite DocBook processor.
597 Please visit these two URLs to learn more:&lt;/p&gt;
598
599 &lt;ul&gt;
600 &lt;li&gt;&lt;a href=&quot;https://codeberg.org/pere/docbook-example/&quot;&gt;https://codeberg.org/pere/docbook-example/&lt;/a&gt;&lt;/li&gt;
601 &lt;li&gt;&lt;a href=&quot;https://pere.codeberg.page/docbook-example/&quot;&gt;https://pere.codeberg.page/docbook-example/&lt;/a&gt;&lt;/li&gt;
602 &lt;/ul&gt;
603
604 &lt;p&gt;If you want to learn more about DocBook and translations, I recommend
605 having a look at &lt;a href=&quot;https://docbook.org/&quot;&gt;the DocBook
606 web site&lt;/a&gt;,
607 &lt;a href=&quot;https://doccookbook.sourceforge.net/html/en/&quot;&gt;the DoCookBook
608 site&lt;/a&gt;, my earlier blog post on
609 &lt;a href=&quot;https://people.skolelinux.org/pere/blog/From_English_wiki_to_translated_PDF_and_epub_via_Docbook.html&quot;&gt;how
610 the Skolelinux project processes and translates documentation&lt;/a&gt;, and a talk I gave earlier this year on
611 &lt;a href=&quot;https://www.nuug.no/aktiviteter/20230314-oversetting-og-publisering-av-b%c3%b8ker-med-fri-programvare/&quot;&gt;how
612 to translate and publish books using free software&lt;/a&gt; (in Norwegian
613 only).&lt;/p&gt;
614
615 &lt;!--
616
617 https://github.com/docbook/xslt10-stylesheets/issues/205 (docbook-xsl: sme support)
618 https://bugs.debian.org/968437 (xindy: index sorting rules for nb/nn)
619 https://bugs.debian.org/856123 (pandoc: markdown to docbook with non-english titles)
620 https://bugs.debian.org/864813 (dblatex: missing nb words)
621 https://bugs.debian.org/756386 (dblatex: index sorting rules for nb/nn)
622 https://bugs.debian.org/796871 (dbtoepub: index sorting rules for nb/nn)
623 https://bugs.debian.org/792616 (dblatex: PDF metadata)
624 https://bugs.debian.org/686908 (docbook-xsl: index sorting rules for nb/nn)
625 https://sourceforge.net/tracker/?func=detail&amp;atid=373747&amp;aid=3556630&amp;group_id=21935 (docbook-xsl: nb/nn support)
626 https://bugs.debian.org/684391 (dblatex: initial nb support)
627
628 --&gt;
629
630 &lt;p&gt;As usual, if you use Bitcoin and want to show your support of my
631 activities, please send Bitcoin donations to my address
632 &lt;b&gt;&lt;a href=&quot;bitcoin:15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b&quot;&gt;15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b&lt;/a&gt;&lt;/b&gt;.&lt;/p&gt;
633 </description>
634 </item>
635
636 </channel>
637 </rss>