<?xml version="1.0" encoding="ISO-8859-1"?>
<rss version='2.0' xmlns:lj='http://www.livejournal.org/rss/lj/1.0/'>
<title>Petter Reinholdtsen - Entries from November 2023</title>
<description>Entries from November 2023</description>
<link>https://people.skolelinux.org/pere/blog/</link>
<title>New and improved sqlcipher in Debian for accessing Signal database</title>
<link>https://people.skolelinux.org/pere/blog/New_and_improved_sqlcipher_in_Debian_for_accessing_Signal_database.html</link>
<guid isPermaLink="true">https://people.skolelinux.org/pere/blog/New_and_improved_sqlcipher_in_Debian_for_accessing_Signal_database.html</guid>
<pubDate>Sun, 12 Nov 2023 12:00:00 +0100</pubDate>
<description><p>For a while now I have wanted direct access to the
<a href="https://signal.org/">Signal</a> database of messages and
channels of my Desktop edition of Signal. I prefer the enforced end
to end encryption of Signal these days for my communication with
friends and family, to increase the level of safety and privacy as
well as to raise the cost of the mass surveillance government and
non-government entities practice these days. In August I came across a
<a href="https://www.yoranbrondsema.com/post/the-guide-to-extracting-statistics-from-your-signal-conversations/">recipe
on how to use sqlcipher to extract statistics from the Signal
database</a> explaining how to do this. Unfortunately this did not
work with the version of sqlcipher in Debian. The
<a href="http://tracker.debian.org/sqlcipher/">sqlcipher</a>
package is a "fork" of the sqlite package with added support for
encrypted databases. Sadly the current Debian maintainer
<a href="https://bugs.debian.org/961598">announced more than three
years ago that he did not have time to maintain sqlcipher</a>, so it
seemed unlikely to be upgraded by the maintainer. I was reluctant to
take on the job myself, as I have very limited experience maintaining
shared libraries in Debian. After waiting and hoping for a few
months, I gave up last week and set out to update the package. In
the process I orphaned it to make it more obvious to the next person
looking at it that the package needs proper maintenance.</p>
<p>The version in Debian was around five years old, and quite a lot of
changes had taken place upstream. After spending a few days importing
the new upstream versions into the Debian maintenance git repository,
and realising that upstream did not care much for SONAME versioning,
as I saw library symbols being both added and removed between minor
versions of the project, I concluded that I had to do a SONAME bump of
the library package to avoid surprising the reverse dependencies. I
even added a simple autopkgtest script to ensure the package works as
intended. Having dug deep into the hole of learning shared library
maintenance, I set out a few days ago to upload the new version to
Debian experimental to see what the quality assurance framework in
Debian had to say about the result. The feedback told me the package
was not too shabby, and yesterday I uploaded the latest version to
Debian unstable. It should enter testing today or tomorrow, perhaps
delayed by <a href="https://bugs.debian.org/1055812">a small library
transition</a>.</p>
<p>Armed with a new version of sqlcipher, I can now have a look at the
SQL database in ~/.config/Signal/sql/db.sqlite. First, one needs to
fetch the encryption key from the Signal configuration using this
simple JSON extraction command:</p>

<pre>/usr/bin/jq -r '."key"' ~/.config/Signal/config.json</pre>
<p>Assume the result from that command is 'secretkey', a hexadecimal
number representing the key used to encrypt the database. Next, one
can connect to the database and inject the encryption key for access
via SQL to fetch information from the database. Here is an example
dumping the database structure:</p>
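The two steps above can be glued together in a few lines of code. Here is a minimal sketch in Python, assuming the default Signal Desktop file layout; the helper name `signal_key_pragma` is made up for illustration and is not part of any Signal or sqlcipher tooling:

```python
import json

def signal_key_pragma(config_path):
    """Build the sqlcipher PRAGMA statement from the Signal Desktop config.

    The "key" entry in config.json is a hexadecimal string; sqlcipher
    expects it wrapped as a blob literal: PRAGMA key = "x'...'";
    """
    with open(config_path) as fh:
        key = json.load(fh)["key"]
    return "PRAGMA key = \"x'%s'\";" % key

# Example (path is the usual Signal Desktop location):
# print(signal_key_pragma("/home/user/.config/Signal/config.json"))
```

The resulting string can be fed to the sqlcipher shell before any other SQL statement.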
<pre>
% sqlcipher ~/.config/Signal/sql/db.sqlite
sqlite> PRAGMA key = "x'secretkey'";
sqlite> .schema
CREATE TABLE sqlite_stat1(tbl,idx,stat);
CREATE TABLE conversations(
  id STRING PRIMARY KEY ASC,
  , profileFamilyName TEXT, profileFullName TEXT, e164 TEXT, serviceId TEXT, groupId TEXT, profileLastFetchedAt INTEGER);
CREATE TABLE identityKeys(
  id STRING PRIMARY KEY ASC,
  id STRING PRIMARY KEY ASC,
CREATE TABLE sessions(
  , ourServiceId STRING, serviceId STRING);
CREATE TABLE attachment_downloads(
  id STRING primary key,
CREATE TABLE sticker_packs(
  coverStickerId INTEGER,
  downloadAttempts INTEGER,
  stickerCount INTEGER,
  , attemptedStatus STRING, position INTEGER DEFAULT 0 NOT NULL, storageID STRING, storageVersion INTEGER, storageUnknownFields BLOB, storageNeedsSync
    INTEGER DEFAULT 0 NOT NULL);
CREATE TABLE stickers(
  packId TEXT NOT NULL,
  PRIMARY KEY (id, packId),
  CONSTRAINT stickers_fk
    REFERENCES sticker_packs(id)
CREATE TABLE sticker_references(
  CONSTRAINT sticker_references_fk
    REFERENCES sticker_packs(id)
  shortName TEXT PRIMARY KEY,
CREATE TABLE messages(
  rowid INTEGER PRIMARY KEY ASC,
  schemaVersion INTEGER,
  conversationId STRING,
  hasAttachments INTEGER,
  hasFileAttachments INTEGER,
  hasVisualMediaAttachments INTEGER,
  expirationStartTimestamp INTEGER,
  messageTimer INTEGER,
  messageTimerStart INTEGER,
  messageTimerExpiresAt INTEGER,
  sourceServiceId TEXT, serverGuid STRING NULL, sourceDevice INTEGER, storyId STRING, isStory INTEGER
    GENERATED ALWAYS AS (type IS 'story'), isChangeCreatedByUs INTEGER NOT NULL DEFAULT 0, isTimerChangeFromSync INTEGER
    GENERATED ALWAYS AS (
      json_extract(json, '$.expirationTimerUpdate.fromSync') IS 1
    ), seenStatus NUMBER default 0, storyDistributionListId STRING, expiresAt INT
      expirationStartTimestamp + (expireTimer * 1000),
    )), shouldAffectActivity INTEGER
    GENERATED ALWAYS AS (
      'change-number-notification',
      'contact-removed-notification',
      'conversation-merge',
      'group-v1-migration',
      'message-history-unsynced',
      'profile-change',
      'universal-timer-notification',
      'verified-change'
    ), shouldAffectPreview INTEGER
    GENERATED ALWAYS AS (
      'change-number-notification',
      'contact-removed-notification',
      'conversation-merge',
      'group-v1-migration',
      'message-history-unsynced',
      'profile-change',
      'universal-timer-notification',
      'verified-change'
    ), isUserInitiatedMessage INTEGER
    GENERATED ALWAYS AS (
      'change-number-notification',
      'contact-removed-notification',
      'conversation-merge',
      'group-v1-migration',
      'group-v2-change',
      'message-history-unsynced',
      'profile-change',
      'universal-timer-notification',
      'verified-change'
    ), mentionsMe INTEGER NOT NULL DEFAULT 0, isGroupLeaveEvent INTEGER
    GENERATED ALWAYS AS (
      type IS 'group-v2-change' AND
      json_array_length(json_extract(json, '$.groupV2Change.details')) IS 1 AND
      json_extract(json, '$.groupV2Change.details[0].type') IS 'member-remove' AND
      json_extract(json, '$.groupV2Change.from') IS NOT NULL AND
      json_extract(json, '$.groupV2Change.from') IS json_extract(json, '$.groupV2Change.details[0].aci')
    ), isGroupLeaveEventFromOther INTEGER
    GENERATED ALWAYS AS (
      isGroupLeaveEvent IS 1
      isChangeCreatedByUs IS 0
    GENERATED ALWAYS AS (
      json_extract(json, '$.callId')
CREATE TABLE sqlite_stat4(tbl,idx,neq,nlt,ndlt,sample);
  queueType TEXT STRING NOT NULL,
  timestamp INTEGER NOT NULL,
CREATE TABLE reactions(
  conversationId STRING,
  messageReceivedAt INTEGER,
  targetAuthorAci STRING,
  targetTimestamp INTEGER,
CREATE TABLE senderKeys(
  id TEXT PRIMARY KEY NOT NULL,
  senderId TEXT NOT NULL,
  distributionId TEXT NOT NULL,
  lastUpdatedDate NUMBER NOT NULL
CREATE TABLE unprocessed(
  id STRING PRIMARY KEY ASC,
  serverTimestamp INTEGER,
  sourceServiceId STRING
  , serverGuid STRING NULL, sourceDevice INTEGER, receivedAtCounter INTEGER, urgent INTEGER, story INTEGER);
CREATE TABLE sendLogPayloads(
  id INTEGER PRIMARY KEY ASC,
  timestamp INTEGER NOT NULL,
  contentHint INTEGER NOT NULL,
  , urgent INTEGER, hasPniSignatureMessage INTEGER DEFAULT 0 NOT NULL);
CREATE TABLE sendLogRecipients(
  payloadId INTEGER NOT NULL,
  recipientServiceId STRING NOT NULL,
  deviceId INTEGER NOT NULL,
  PRIMARY KEY (payloadId, recipientServiceId, deviceId),
  CONSTRAINT sendLogRecipientsForeignKey
    FOREIGN KEY (payloadId)
    REFERENCES sendLogPayloads(id)
CREATE TABLE sendLogMessageIds(
  payloadId INTEGER NOT NULL,
  messageId STRING NOT NULL,
  PRIMARY KEY (payloadId, messageId),
  CONSTRAINT sendLogMessageIdsForeignKey
    FOREIGN KEY (payloadId)
    REFERENCES sendLogPayloads(id)
CREATE TABLE preKeys(
  id STRING PRIMARY KEY ASC,
  , ourServiceId NUMBER
    GENERATED ALWAYS AS (json_extract(json, '$.ourServiceId')));
CREATE TABLE signedPreKeys(
  id STRING PRIMARY KEY ASC,
  , ourServiceId NUMBER
    GENERATED ALWAYS AS (json_extract(json, '$.ourServiceId')));
  category TEXT NOT NULL,
  descriptionTemplate TEXT NOT NULL
CREATE TABLE badgeImageFiles(
  badgeId TEXT REFERENCES badges(id)
  'order' INTEGER NOT NULL,
CREATE TABLE storyReads (
  authorId STRING NOT NULL,
  conversationId STRING NOT NULL,
  storyId STRING NOT NULL,
  storyReadDate NUMBER NOT NULL,
  PRIMARY KEY (authorId, storyId)
CREATE TABLE storyDistributions(
  id STRING PRIMARY KEY NOT NULL,
  senderKeyInfoJson STRING
  , deletedAtTimestamp INTEGER, allowsReplies INTEGER, isBlockList INTEGER, storageID STRING, storageVersion INTEGER, storageUnknownFields BLOB, storageNeedsSync INTEGER);
CREATE TABLE storyDistributionMembers(
  listId STRING NOT NULL REFERENCES storyDistributions(id)
  serviceId STRING NOT NULL,
  PRIMARY KEY (listId, serviceId)
CREATE TABLE uninstalled_sticker_packs (
  id STRING NOT NULL PRIMARY KEY,
  uninstalledAt NUMBER NOT NULL,
  storageVersion NUMBER,
  storageUnknownFields BLOB,
  storageNeedsSync INTEGER NOT NULL
CREATE TABLE groupCallRingCancellations(
  ringId INTEGER PRIMARY KEY,
  createdAt INTEGER NOT NULL
CREATE TABLE IF NOT EXISTS 'messages_fts_data'(id INTEGER PRIMARY KEY, block BLOB);
CREATE TABLE IF NOT EXISTS 'messages_fts_idx'(segid, term, pgno, PRIMARY KEY(segid, term)) WITHOUT ROWID;
CREATE TABLE IF NOT EXISTS 'messages_fts_content'(id INTEGER PRIMARY KEY, c0);
CREATE TABLE IF NOT EXISTS 'messages_fts_docsize'(id INTEGER PRIMARY KEY, sz BLOB);
CREATE TABLE IF NOT EXISTS 'messages_fts_config'(k PRIMARY KEY, v) WITHOUT ROWID;
CREATE TABLE edited_messages(
  messageId STRING REFERENCES messages(id)
  , conversationId STRING);
CREATE TABLE mentions (
  messageId REFERENCES messages(id) ON DELETE CASCADE,
CREATE TABLE kyberPreKeys(
  id STRING PRIMARY KEY NOT NULL,
  json TEXT NOT NULL, ourServiceId NUMBER
    GENERATED ALWAYS AS (json_extract(json, '$.ourServiceId')));
CREATE TABLE callsHistory (
  callId TEXT PRIMARY KEY,
  peerId TEXT NOT NULL, -- conversation id (legacy) | uuid | groupId | roomId
  ringerId TEXT DEFAULT NULL, -- ringer uuid
  mode TEXT NOT NULL, -- enum "Direct" | "Group"
  type TEXT NOT NULL, -- enum "Audio" | "Video" | "Group"
  direction TEXT NOT NULL, -- enum "Incoming" | "Outgoing"
  -- Direct: enum "Pending" | "Missed" | "Accepted" | "Deleted"
  -- Group: enum "GenericGroupCall" | "OutgoingRing" | "Ringing" | "Joined" | "Missed" | "Declined" | "Accepted" | "Deleted"
  status TEXT NOT NULL,
  timestamp INTEGER NOT NULL,
  UNIQUE (callId, peerId) ON CONFLICT FAIL
[ dropped all indexes to save space in this blog post ]
CREATE TRIGGER messages_on_view_once_update AFTER UPDATE ON messages
  new.body IS NOT NULL AND new.isViewOnce = 1
  DELETE FROM messages_fts WHERE rowid = old.rowid;
CREATE TRIGGER messages_on_insert AFTER INSERT ON messages
WHEN new.isViewOnce IS NOT 1 AND new.storyId IS NULL
  INSERT INTO messages_fts
  (new.rowid, new.body);
CREATE TRIGGER messages_on_delete AFTER DELETE ON messages BEGIN
  DELETE FROM messages_fts WHERE rowid = old.rowid;
  DELETE FROM sendLogPayloads WHERE id IN (
    SELECT payloadId FROM sendLogMessageIds
    WHERE messageId = old.id
  DELETE FROM reactions WHERE rowid IN (
    SELECT rowid FROM reactions
    WHERE messageId = old.id
  DELETE FROM storyReads WHERE storyId = old.storyId;
CREATE VIRTUAL TABLE messages_fts USING fts5(
  tokenize = 'signal_tokenizer'
CREATE TRIGGER messages_on_update AFTER UPDATE ON messages
  (new.body IS NULL OR old.body IS NOT new.body) AND
  new.isViewOnce IS NOT 1 AND new.storyId IS NULL
  DELETE FROM messages_fts WHERE rowid = old.rowid;
  INSERT INTO messages_fts
  (new.rowid, new.body);
CREATE TRIGGER messages_on_insert_insert_mentions AFTER INSERT ON messages
  INSERT INTO mentions (messageId, mentionAci, start, length)
  SELECT messages.id, bodyRanges.value ->> 'mentionAci' as mentionAci,
    bodyRanges.value ->> 'start' as start,
    bodyRanges.value ->> 'length' as length
  FROM messages, json_each(messages.json ->> 'bodyRanges') as bodyRanges
  WHERE bodyRanges.value ->> 'mentionAci' IS NOT NULL
    AND messages.id = new.id;
CREATE TRIGGER messages_on_update_update_mentions AFTER UPDATE ON messages
  DELETE FROM mentions WHERE messageId = new.id;
  INSERT INTO mentions (messageId, mentionAci, start, length)
  SELECT messages.id, bodyRanges.value ->> 'mentionAci' as mentionAci,
    bodyRanges.value ->> 'start' as start,
    bodyRanges.value ->> 'length' as length
  FROM messages, json_each(messages.json ->> 'bodyRanges') as bodyRanges
  WHERE bodyRanges.value ->> 'mentionAci' IS NOT NULL
    AND messages.id = new.id;
</pre>
<p>Finally I have the tool needed to inspect and process the Signal
messages I care about, without using the vendor provided client. Now
on to transforming them to a more useful format.</p>
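One way to start that transformation could look like the following sketch in Python. It assumes the interesting rows have first been exported from the sqlcipher shell in '.mode json' output mode (producing a JSON array of row objects); the function name and the choice of grouping by conversationId are mine for illustration, not part of any Signal tooling:

```python
import json

def summarise_messages(json_text):
    """Count messages per conversationId in a JSON export of the
    messages table, as produced by the sqlite/sqlcipher shell with
    '.mode json' before a SELECT on the messages table."""
    counts = {}
    for row in json.loads(json_text):
        conv = row.get("conversationId") or "(none)"
        counts[conv] = counts.get(conv, 0) + 1
    return counts
```

From such per-conversation counts it is a short step to the kind of statistics the recipe linked above computes.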
<p>As usual, if you use Bitcoin and want to show your support of my
activities, please send Bitcoin donations to my address
<b><a href="bitcoin:15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b">15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b</a></b>.</p>
<title>New chrpath release 0.17</title>
<link>https://people.skolelinux.org/pere/blog/New_chrpath_release_0_17.html</link>
<guid isPermaLink="true">https://people.skolelinux.org/pere/blog/New_chrpath_release_0_17.html</guid>
<pubDate>Fri, 10 Nov 2023 07:30:00 +0100</pubDate>
<description><p>The chrpath package provides a simple command line tool
to remove or modify the rpath or runpath of compiled ELF programs. It
is almost 10 years since I last updated the code base, but I stumbled
over the tool today, and decided it was time to move the code base from
Subversion to git and find a new home for it, as the previous one
(Debian Alioth) has been shut down. I decided to go with
<a href="https://codeberg.org/">Codeberg</a> this time, as it is my git
service of choice these days, did a quick and dirty migration to git
and updated the code with a few patches I found in the Debian bug
tracker. These are the release notes:</p>
<p>New in 0.17 released 2023-11-10:</p>
<ul>
<li>Moved project to Codeberg, as Alioth is shut down.</li>
<li>Add Solaris support (use &lt;sys/byteorder.h&gt; instead of &lt;byteswap.h&gt;).
  Patch from Rainer Orth.</li>
<li>Added missing newline to printf() line. Patch from Frank Dana.</li>
<li>Corrected handling of multiple ELF sections. Patch from Frank Dana.</li>
<li>Updated build rules for .deb. Partly based on patch from djcj.</li>
</ul>
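For those who have not used chrpath before, typical invocations look like this (an illustrative transcript; ./myprog is a placeholder binary):

```shell
# List the current rpath/runpath of a binary
chrpath -l ./myprog

# Replace the runpath with a new directory (the new value cannot be
# longer than the existing one, as chrpath edits the ELF in place)
chrpath -r /opt/myapp/lib ./myprog

# Remove the rpath/runpath entirely
chrpath -d ./myprog
```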
<p>The latest edition is tagged and available from
<a href="https://codeberg.org/pere/chrpath">https://codeberg.org/pere/chrpath</a>.</p>

<p>As usual, if you use Bitcoin and want to show your support of my
activities, please send Bitcoin donations to my address
<b><a href="bitcoin:15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b">15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b</a></b>.</p>
<title>Test framework for DocBook processors / formatters</title>
<link>https://people.skolelinux.org/pere/blog/Test_framework_for_DocBook_processors___formatters.html</link>
<guid isPermaLink="true">https://people.skolelinux.org/pere/blog/Test_framework_for_DocBook_processors___formatters.html</guid>
<pubDate>Sun, 5 Nov 2023 13:00:00 +0100</pubDate>
<description><p>All the books I have published so far have used
<a href="https://docbook.org/">DocBook</a> somewhere in the process.
For the first book, the source format was DocBook, while for every
later book it was an intermediate format used as the stepping stone to
be able to present the same manuscript in several formats: on paper,
as an ebook in ePub format, as an HTML page and as a PDF file either
for paper production or for Internet consumption. This is made
possible with a wide variety of free software tools with DocBook
support in Debian. The source formats of later books have been docx
via rst, Markdown, Filemaker and Asciidoc, and for all of these I was
able to generate a suitable DocBook file for further processing using
<a href="https://tracker.debian.org/pkg/pandoc">pandoc</a>,
<a href="https://tracker.debian.org/pkg/asciidoc">a2x</a> and
<a href="https://tracker.debian.org/pkg/asciidoctor">asciidoctor</a>,
as well as rendering using
<a href="https://tracker.debian.org/pkg/xmlto">xmlto</a>,
<a href="https://tracker.debian.org/pkg/dbtoepub">dbtoepub</a>,
<a href="https://tracker.debian.org/pkg/dblatex">dblatex</a>,
<a href="https://tracker.debian.org/pkg/docbook-xsl">docbook-xsl</a> and
<a href="https://tracker.debian.org/pkg/fop">fop</a>.</p>
<p>Most of the <a href="http://www.hungry.com/~pere/publisher/">books I
have published</a> are translated books, with English as the source
language. Using <a href="https://tracker.debian.org/pkg/po4a">po4a</a>
to handle translations using the gettext PO format has been a
blessing, but publishing translated books has triggered the need to
ensure the DocBook tools handle the relevant languages correctly. For
every new language I have published, I had to submit patches to
dblatex, dbtoepub and docbook-xsl fixing incorrect language and
country specific issues in the frameworks themselves. Typically this
has been missing keywords like 'figure' or sort ordering of index
entries. After a while it became tiresome to only discover issues
like this by accident, and I decided to write a DocBook "test
framework" exercising various features of DocBook and allowing me to
see all features exercised for a given language. It consists of a set
of DocBook files: a version 4 book, a version 5 book, a v4 book set, a
v4 selection of problematic tables, one v4 testing sidefloat and
finally one v4 testing a book of articles. The DocBook files are
accompanied by a set of build rules for building PDF using dblatex and
docbook-xsl/fop, HTML using xmlto or docbook-xsl, and epub using
dbtoepub. The result is a set of files visualizing footnotes,
indexes, the table of contents, figures, formulas and other DocBook
features, allowing for a quick review of the completeness of the given
locale settings. To build with a different language setting, all one
needs to do is edit the lang= value in the .xml file to pick a
different ISO 639 code value and run 'make'.</p>
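For illustration, a minimal DocBook 5 book of the kind used in the test framework might start like this (a sketch, not an actual file from the framework; the v4 files use a plain lang= attribute instead of xml:lang):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<book xmlns="http://docbook.org/ns/docbook" version="5.0"
      xml:lang="nb">
  <title>Example test book</title>
  <chapter>
    <title>First chapter</title>
    <para>Sample text exercising a locale-sensitive feature.</para>
  </chapter>
</book>
```

Changing the xml:lang value to another ISO 639 code and running 'make' rebuilds the output for that language.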
<p>The <a href="https://codeberg.org/pere/docbook-example/">test
framework source code</a> is available from Codeberg, and a generated
set of presentations of the various examples is available as Codeberg
static pages at
<a href="https://pere.codeberg.page/docbook-example/">https://pere.codeberg.page/docbook-example/</a>.
Using this test framework I have been able to discover and report
several bugs and missing features in various tools, and got a lot of
them fixed. For example I got Northern Sami keywords added to both
docbook-xsl and dblatex, fixed several typos in Norwegian Bokmål and
Norwegian Nynorsk, got support for non-ASCII title IDs added to
pandoc, got Norwegian index sorting support fixed in xindy and got
initial Norwegian Bokmål support added to dblatex. Some issues still
remain, though. Default index sorting rules are still broken in
several tools, so the Norwegian letters æ, ø and å are more often than
not sorted incorrectly in the book index.</p>
<p>The test framework recently received some more polish, as part of
publishing my latest book. This book contained a lot of fairly
complex tables, which exposed bugs in some of the tools. This made me
add a new test file with various tables, as well as spend some time
brushing up the build rules. My goal is for the test framework to
exercise all DocBook features, to make it easier to see which features
work with different processors, and hopefully get them all to support
the full set of DocBook features. Feel free to send patches to extend
the test set, and test it with your favorite DocBook processor.
Please visit these two URLs to learn more:</p>
<ul>
<li><a href="https://codeberg.org/pere/docbook-example/">https://codeberg.org/pere/docbook-example/</a></li>
<li><a href="https://pere.codeberg.page/docbook-example/">https://pere.codeberg.page/docbook-example/</a></li>
</ul>
<p>If you want to learn more on DocBook and translations, I recommend
having a look at <a href="https://docbook.org/">the DocBook web
site</a>, <a href="https://doccookbook.sourceforge.net/html/en/">the
DoCookBook site</a> and my earlier blog post on
<a href="https://people.skolelinux.org/pere/blog/From_English_wiki_to_translated_PDF_and_epub_via_Docbook.html">how
the Skolelinux project processes and translates documentation</a>, as
well as a talk I gave earlier this year on
<a href="https://www.nuug.no/aktiviteter/20230314-oversetting-og-publisering-av-b%c3%b8ker-med-fri-programvare/">how
to translate and publish books using free software</a> (Norwegian
only).</p>
<pre>
https://github.com/docbook/xslt10-stylesheets/issues/205 (docbook-xsl: sme support)
https://bugs.debian.org/968437 (xindy: index sorting rules for nb/nn)
https://bugs.debian.org/856123 (pandoc: markdown to docbook with non-english titles)
https://bugs.debian.org/864813 (dblatex: missing nb words)
https://bugs.debian.org/756386 (dblatex: index sorting rules for nb/nn)
https://bugs.debian.org/796871 (dbtoepub: index sorting rules for nb/nn)
https://bugs.debian.org/792616 (dblatex: PDF metadata)
https://bugs.debian.org/686908 (docbook-xsl: index sorting rules for nb/nn)
https://sourceforge.net/tracker/?func=detail&atid=373747&aid=3556630&group_id=21935 (docbook-xsl: nb/nn support)
https://bugs.debian.org/684391 (dblatex: initial nb support)
</pre>
<p>As usual, if you use Bitcoin and want to show your support of my
activities, please send Bitcoin donations to my address
<b><a href="bitcoin:15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b">15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b</a></b>.</p>