This is a copy of an email I posted to the nikita-noark mailing list.
Please follow up there if you would like to discuss this topic. The
background is that we are making a free software archive system based
on the Norwegian Noark 5 standard for government archives.

I've been wondering a bit lately how trusted timestamps could be
stored in Noark 5. Trusted timestamps can be used to verify that some
information (document/file/checksum/metadata) has not been changed
since a specific time in the past. This is useful to verify the
integrity of the documents in the archive.

Then it occurred to me, perhaps the trusted timestamps could be
stored as dokument variants (ie dokumentobjekt referred to from
dokumentbeskrivelse) with the filename set to the hash it is
stamping?

Given a "dokumentbeskrivelse" with an associated "dokumentobjekt",
a new dokumentobjekt is associated with "dokumentbeskrivelse" with the
same attributes as the stamped dokumentobjekt, except these
attributes:

- format -> "RFC3161"
- mimeType -> "application/timestamp-reply"
- formatDetaljer -> "<source URL for timestamp service>"
- filenavn -> "<sjekksum>.tsr"
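As a concrete sketch of the idea (my own illustration, not a
serialisation taken from the Noark 5 specification, and with an
illustrative checksum value), such a timestamp variant could look
like this:

```json
{
  "format": "RFC3161",
  "mimeType": "application/timestamp-reply",
  "formatDetaljer": "http://zeitstempel.dfn.de",
  "filenavn": "b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9.tsr"
}
```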
This assumes a service following IETF RFC 3161 is used, which
specifies the given MIME type for replies and the .tsr file ending for
the content of such trusted timestamps. As far as I can tell from the
Noark 5 specifications, it is OK to have several variants/renderings
of a dokument attached to a given dokumentbeskrivelse objekt. It
might be stretching it a bit to make some of these variants represent
crypto-signatures useful for verifying the document integrity instead
of representing the dokument itself.

Using the source of the service in formatDetaljer allows several
timestamping services to be used. This is useful to spread the risk
of key compromise over several organisations. It would only be a
problem to trust the timestamps if all of the organisations were
compromised.
The following one-liner on Linux can be used to generate the tsr
file. $inputfile is the path to the file to checksum, and $sha256 is
the SHA-256 checksum of the file (ie the <sjekksum> value mentioned
above).

openssl ts -query -data "$inputfile" -cert -sha256 -no_nonce \
  | curl -s -H "Content-Type: application/timestamp-query" \
      --data-binary "@-" http://zeitstempel.dfn.de > $sha256.tsr
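The $sha256 value itself is easy to compute. Here is a minimal
Python sketch (using only the standard library) producing the
checksum used for the file name:

```python
import hashlib

def file_sha256(path):
    """Return the hex encoded SHA-256 checksum of a file, reading
    it in chunks so large documents do not exhaust memory."""
    digest = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(65536), b''):
            digest.update(chunk)
    return digest.hexdigest()

# The timestamp reply for a document would then be stored as
# file_sha256(inputfile) + '.tsr'
```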
To verify the timestamp, you first need to download the public key
of the trusted timestamp service, for example using this command:

wget -O ca-cert.txt \
  https://pki.pca.dfn.de/global-services-ca/pub/cacert/chain.txt

Note, the public key should be stored alongside the timestamps in
the archive to make sure it is also available 100 years from now. It
is probably a good idea to standardise how and where to store such
public keys, to make it easier to find for those trying to verify
documents 100 or 1000 years from now. :)

The verification itself is a simple openssl command:

openssl ts -verify -data $inputfile -in $sha256.tsr \
  -CAfile ca-cert.txt -text

Is there any reason this approach would not work? Is it somehow
against the Noark 5 specification?
8th October 2018

I have earlier covered the basics of trusted timestamping using the
'openssl ts' client. See the blog posts from 2014, 2016 and 2017 for
those stories. But sometimes I want to integrate the timestamping in
other code, and recently I needed to integrate it into Python. After
searching a bit, I found the rfc3161 library, which seemed like a good
fit, but I soon discovered it only worked with Python version 2, and I
needed something that works with Python version 3. Luckily I next
came across the rfc3161ng library, a fork of the original rfc3161
library. Not only is it working with Python 3, it has fixed a few of
the bugs in the original library, and it has an active maintainer. I
decided to wrap it up and make it available in Debian, and a few days
ago it entered Debian unstable and testing.

Using the library is fairly straightforward. The only slightly
problematic step is to fetch the required certificates to verify the
timestamp. For some services it is straightforward, while for others
I have not yet figured out how to do it. Here is a small standalone
code example based on one of the integration tests in the library
code:
#!/usr/bin/python3

"""

Python 3 script demonstrating how to use the rfc3161ng module to
get trusted timestamps.

The license of this code is the same as the license of the rfc3161ng
library, ie MIT/BSD.

"""

import pyasn1.codec.der.encoder
import rfc3161ng
import subprocess
import tempfile
import urllib.request

def store(f, data):
    f.write(data)
    f.flush()
    f.seek(0)

def fetch(url, f=None):
    response = urllib.request.urlopen(url)
    data = response.read()
    if f:
        store(f, data)
    return data

def main():
    with tempfile.NamedTemporaryFile() as cert_f,\
         tempfile.NamedTemporaryFile() as ca_f,\
         tempfile.NamedTemporaryFile() as msg_f,\
         tempfile.NamedTemporaryFile() as tsr_f:

        # First fetch certificates used by the service
        certificate_data = fetch('https://freetsa.org/files/tsa.crt', cert_f)
        ca_data = fetch('https://freetsa.org/files/cacert.pem', ca_f)

        # Then timestamp the message
        timestamper = \
            rfc3161ng.RemoteTimestamper('http://freetsa.org/tsr',
                                        certificate=certificate_data)
        data = b"Python forever!\n"
        tsr = timestamper(data=data, return_tsr=True)

        # Finally, convert message and response to something 'openssl ts' can verify
        store(msg_f, data)
        store(tsr_f, pyasn1.codec.der.encoder.encode(tsr))
        args = ["openssl", "ts", "-verify",
                "-data", msg_f.name,
                "-in", tsr_f.name,
                "-CAfile", ca_f.name,
                "-untrusted", cert_f.name]
        subprocess.check_call(args)

if '__main__' == __name__:
    main()
The code fetches the required certificates, stores them as temporary
files, timestamps a simple message, stores the message and timestamp
to disk and asks 'openssl ts' to verify the timestamp. A timestamp is
around 1.5 KiB in size, and should be fairly easy to store for future
use.

As usual, if you use Bitcoin and want to show your support of my
activities, please send Bitcoin donations to my address
15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.
3rd June 2017

Aftenposten reports today on errors in the exam questions for the
exam in politics and human rights, where the texts in the Bokmål and
Nynorsk versions were not identical. The exam text is quoted in the
article, and I got curious whether the free translation system
Apertium would have done a better job than the Norwegian Directorate
for Education and Training. It looks like it would.

Here is the Bokmål text from the exam:

Drøft utfordringene knyttet til nasjonalstatenes og andre aktørers
rolle og muligheter til å håndtere internasjonale utfordringer, som
for eksempel flykningekrisen.

Vedlegge er eksempler på tekster som kan gi relevante perspektiver
på temaet:

- Flykningeregnskapet 2016, UNHCR og IDMC
- «Grenseløst Europa for fall» A-Magasinet, 26. november 2015

Apertium translates this as follows:

Drøft utfordringane knytte til nasjonalstatane sine og rolla til
andre aktørar og høve til å handtera internasjonale utfordringar, som
til dømes *flykningekrisen.

Vedleggja er døme på tekster som kan gje relevante perspektiv på
temaet:

- *Flykningeregnskapet 2016, *UNHCR og *IDMC
- «*Grenseløst Europa for fall» A-Magasinet, 26. november 2015

Words that were not understood are marked with an asterisk (*), and
need an extra language check. But no words disappeared, unlike in the
exam text the students were presented with. I do suspect, though,
that "andre aktørers rolle og muligheter til ..." should have been
translated to "rolla til andre aktørar og deira høve til ..." or
something like that, but that is perhaps nitpicking. It merely
underlines that proofreading is always needed after machine
translation.
4th October 2018

A few days ago, I rescued a Windows victim over to Debian. To try to
rescue the remains, I helped set up automatic sync with Google Drive.
I did not find any sensible Debian package handling this
automatically, so I rebuilt the grive2 source from the Ubuntu UPD8
PPA to do the task, and added an autostart desktop entry and a small
shell script to run in the background while the user is logged in to
do the sync. Here is a sketch of the setup for future reference.

I first created ~/googledrive, entered the directory and ran
'grive -a' to authenticate the machine/user. Next, I created an
autostart hook in ~/.config/autostart/grive.desktop to start the sync
when the user logs in:

[Desktop Entry]
Name=Google drive autosync
Type=Application
Exec=/home/user/bin/grive-sync

Finally, I wrote the ~/bin/grive-sync script to sync ~/googledrive/
with the files in Google Drive.
#!/bin/sh
set -e
cd ~/
cleanup() {
    if [ "$syncpid" ] ; then
        kill $syncpid
    fi
}
trap cleanup EXIT INT QUIT
/usr/lib/grive/grive-sync.sh listen googledrive 2>&1 | sed "s%^%$0:%" &
syncpid=$!
while true; do
    if ! xhost >/dev/null 2>&1 ; then
        echo "no DISPLAY, exiting as the user probably logged out"
        exit 1
    fi
    if [ ! -e /run/user/1000/grive-sync.sh_googledrive ] ; then
        /usr/lib/grive/grive-sync.sh sync googledrive
    fi
    sleep 300
done 2>&1 | sed "s%^%$0:%"

Feel free to use the setup if you want. It can be assumed to be
GNU GPL v2 licensed (or any later version, at your leisure), but I
doubt it is possible to claim copyright on this code.

As usual, if you use Bitcoin and want to show your support of my
activities, please send Bitcoin donations to my address
15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.
27th April 2017

These days, with a deadline of 1 May, the National Archivist of
Norway (Riksarkivaren) has a consultation out on its regulations. As
you can see, there is not much time left before the deadline, which
expires on Sunday. These regulations are what list the formats that
are acceptable for archiving in Noark 5 solutions in Norway.

I found the consultation documents at Norsk Arkivråd after being
tipped off on the mailing list of the free software project Nikita
Noark5-Core, which is making a Noark 5 Tjenestegrensesnitt (web
service). I am involved in the Nikita project, and thanks to my
interest in the web service project I have read quite a few Noark 5
related documents, and discovered to my surprise that standard email
is not on the list of approved formats that can be archived. The
consultation with its deadline on Sunday is an excellent opportunity
to try to do something about it. I am working on my own consultation
response, and wonder if others are interested in supporting the
proposal to allow archiving email as email in the archive.

Are you already writing your own consultation response? If so, you
could consider including a statement about email storage. I do not
think much is needed. Here is a short proposed text:

  We refer to the consultation sent out 2017-02-17 (the National
  Archivist's reference 2016/9840 HELHJO), and take the liberty of
  submitting some input on the revision of the regulations on
  supplementary technical and archival provisions on the handling of
  public archives (the National Archivist's regulations).

  Very much of our communication today takes place by email. We
  therefore propose that Internet email, as described in IETF RFC
  5322, https://tools.ietf.org/html/rfc5322, should be included as an
  approved document format. We propose that the regulation's list of
  approved document formats at submission in § 5-16 be amended to
  include Internet email.

As part of the work on the web service we have tested how email can
be stored in a Noark 5 structure, and we are writing a proposal for
how this can be done, which will be sent over to the National Archives
of Norway as soon as it is finished. Those interested can follow the
progress on the web.

Update 2017-04-28: Today the consultation response I wrote was
submitted by the association NUUG.
29th September 2018

It would come as no surprise to anyone that I am interested in
bitcoins and virtual currencies. I've been keeping an eye on virtual
currencies for many years, and it is part of the reason that, a few
months ago, I started writing a Python library for collecting currency
exchange rates and trading on virtual currency exchanges. I decided
to name the end result valutakrambod, which perhaps can be translated
to small currency shop.

The library uses the tornado Python library to handle HTTP and
websocket connections, and provides an asynchronous system for
connecting to and tracking several services. The code is available
from github.

There are two example clients of the library. One is very simple and
lists every updated buy/sell price received from the various services.
This code is started by running bin/btc-rates and calls the client
code in valutakrambod/client.py. The simple client looks like this:
import functools
import tornado.ioloop
import valutakrambod
class SimpleClient(object):
    def __init__(self):
        self.services = []
        self.streams = []
        pass
    def newdata(self, service, pair, changed):
        print("%-15s %s-%s: %8.3f %8.3f" % (
            service.servicename(),
            pair[0],
            pair[1],
            service.rates[pair]['ask'],
            service.rates[pair]['bid'])
        )
    async def refresh(self, service):
        await service.fetchRates(service.wantedpairs)
    def run(self):
        self.ioloop = tornado.ioloop.IOLoop.current()
        self.services = valutakrambod.service.knownServices()
        for e in self.services:
            service = e()
            service.subscribe(self.newdata)
            stream = service.websocket()
            if stream:
                self.streams.append(stream)
            else:
                # Fetch information from non-streaming services immediately
                self.ioloop.call_later(len(self.services),
                                       functools.partial(self.refresh, service))
                # as well as regularly
                service.periodicUpdate(60)
        for stream in self.streams:
            stream.connect()
        try:
            self.ioloop.start()
        except KeyboardInterrupt:
            print("Interrupted by keyboard, closing all connections.")
            pass
        for stream in self.streams:
            stream.close()
The library client loops over all known "public" services,
initialises each of them, subscribes to any updates from the service,
checks and activates websocket streaming if the service provides it,
and if no streaming is supported, fetches information from the service
and sets up a periodic update every 60 seconds. The output from this
client can look like this:

Bl3p            BTC-EUR: 5687.110 5653.690
Bl3p            BTC-EUR: 5687.110 5653.690
Bl3p            BTC-EUR: 5687.110 5653.690
Hitbtc          BTC-USD: 6594.560 6593.690
Hitbtc          BTC-USD: 6594.560 6593.690
Bl3p            BTC-EUR: 5687.110 5653.690
Hitbtc          BTC-USD: 6594.570 6593.690
Bitstamp        EUR-USD:    1.159    1.154
Hitbtc          BTC-USD: 6594.570 6593.690
Hitbtc          BTC-USD: 6594.580 6593.690
Hitbtc          BTC-USD: 6594.580 6593.690
Hitbtc          BTC-USD: 6594.580 6593.690
Bl3p            BTC-EUR: 5687.110 5653.690
Paymium         BTC-EUR: 5680.000 5620.240
The exchange order book is tracked in addition to the best buy/sell
price, for those that need to know the details.

The other example client focuses on providing a curses view with
updated buy/sell prices as soon as they are received from the
services. This code is located in bin/btc-rates-curses and is
activated by using the '-c' argument. Without the argument the
"curses" output is printed without using curses, which is useful for
debugging. The curses view looks like this:

           Name Pair   Bid         Ask         Spr    Ftcd    Age
 BitcoinsNorway BTCEUR   5591.8400   5711.0800   2.1%   16    nan     60
       Bitfinex BTCEUR   5671.0000   5671.2000   0.0%   16     22     59
        Bitmynt BTCEUR   5580.8000   5807.5200   3.9%   16     41     60
         Bitpay BTCEUR   5663.2700         nan   nan%   15    nan     60
       Bitstamp BTCEUR   5664.8400   5676.5300   0.2%    0      1      1
           Bl3p BTCEUR   5653.6900   5684.9400   0.5%    0    nan     19
       Coinbase BTCEUR   5600.8200   5714.9000   2.0%   15    nan    nan
         Kraken BTCEUR   5670.1000   5670.2000   0.0%   14     17     60
        Paymium BTCEUR   5620.0600   5680.0000   1.1%    1   7515    nan
 BitcoinsNorway BTCNOK  52898.9700  54034.6100   2.1%   16    nan     60
        Bitmynt BTCNOK  52960.3200  54031.1900   2.0%   16     41     60
         Bitpay BTCNOK  53477.7833         nan   nan%   16    nan     60
       Coinbase BTCNOK  52990.3500  54063.0600   2.0%   15    nan    nan
        MiraiEx BTCNOK  52856.5300  54100.6000   2.3%   16    nan    nan
 BitcoinsNorway BTCUSD   6495.5300   6631.5400   2.1%   16    nan     60
       Bitfinex BTCUSD   6590.6000   6590.7000   0.0%   16     23     57
         Bitpay BTCUSD   6564.1300         nan   nan%   15    nan     60
       Bitstamp BTCUSD   6561.1400   6565.6200   0.1%    0      2      1
       Coinbase BTCUSD   6504.0600   6635.9700   2.0%   14    nan    117
         Gemini BTCUSD   6567.1300   6573.0700   0.1%   16     89    nan
         Hitbtc BTCUSD   6592.6200   6594.2100   0.0%    0      0      0
         Kraken BTCUSD   6565.2000   6570.9000   0.1%   15     17     58
  Exchangerates EURNOK      9.4665      9.4665   0.0%   16 107789    nan
     Norgesbank EURNOK      9.4665      9.4665   0.0%   16 107789    nan
       Bitstamp EURUSD      1.1537      1.1593   0.5%    4      5      1
  Exchangerates EURUSD      1.1576      1.1576   0.0%   16 107789    nan
 BitcoinsNorway LTCEUR      1.0000     49.0000  98.0%   16    nan    nan
 BitcoinsNorway LTCNOK    492.4800    503.7500   2.2%   16    nan     60
 BitcoinsNorway LTCUSD      1.0221     49.0000  97.9%   15    nan    nan
     Norgesbank USDNOK      8.1777      8.1777   0.0%   16 107789    nan
The code for this client is too complex for a simple blog post, so
you will have to check out the git repository to figure out how it
works. What I can tell is how the three last numbers on each line
should be interpreted. The first is how many seconds ago information
was received from the service. The second is how long ago, according
to the service, the provided information was updated. The last is an
estimate of how often the buy/sell values change.
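The Spr column is the relative spread between the best bid and ask.
The calculation can be sketched like this (my own illustration, not
code from the library, but it matches the numbers in the table above):

```python
def spread_percent(bid, ask):
    """Relative difference between best ask and best bid, as a
    percentage of the ask price."""
    return (ask - bid) / ask * 100

# The Bl3p BTCEUR line above: bid 5653.69, ask 5684.94
print("%.1f%%" % spread_percent(5653.69, 5684.94))  # → 0.5%
```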
If you find this library useful, or would like to improve it, I
would love to hear from you. Note that for some of the services I've
implemented a trading API. It might be the topic of a future blog
post.

As usual, if you use Bitcoin and want to show your support of my
activities, please send Bitcoin donations to my address
15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.
20th April 2017

I discovered today that OEP, the web site publishing public mail
journals from Norwegian government agencies, has started blocking some
types of web clients from getting access. I do not know how many are
affected, but at least libwww-perl and curl are blocked. To test for
yourself, run the following:

% curl -v -s https://www.oep.no/pub/report.xhtml?reportId=3 2>&1 |grep '< HTTP'
< HTTP/1.1 404 Not Found
% curl -v -s --header 'User-Agent:Opera/12.0' https://www.oep.no/pub/report.xhtml?reportId=3 2>&1 |grep '< HTTP'
< HTTP/1.1 200 OK
%

Here you can see that the service returns «404 Not Found» for curl
with its default settings, while it returns «200 OK» if curl claims to
be Opera version 12.0. The public electronic mail journal started the
blocking 2017-03-02.

The blocking will make it a bit harder to fetch information from
oep.no automatically. Could the blocking have been introduced to
prevent automated collection of information from OEP, like the
collection Pressens Offentlighetsutvalg did to document how the
ministries hinder access to information in the report «Slik hindrer
departementer innsyn», published in January 2017? That seems
unlikely, as it is trivial to change the User-Agent to something new.
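To illustrate just how trivial, here is a small Python sketch (my
own illustration, unrelated to any OEP code) attaching a different
User-Agent to a request using only the standard library:

```python
import urllib.request

def make_request(url, user_agent='Opera/12.0'):
    """Build a request claiming to come from a different browser.
    Note that urllib stores header names capitalised, so the header
    is looked up as 'User-agent'."""
    return urllib.request.Request(url, headers={'User-Agent': user_agent})

req = make_request('https://www.oep.no/pub/report.xhtml?reportId=3')
print(req.get_header('User-agent'))  # → Opera/12.0
```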
Is there any legal basis for the government to discriminate between
web clients the way it is done here, where access is granted or denied
depending on what the client claims its own name is? As OEP is owned
by DIFI and operated by Basefarm, there may be documents exchanged
between these two parties that one could request access to in order to
understand what has happened. But DIFI's own mail journal shows only
two documents between DIFI and Basefarm during the last year. Mimes
brønn is the next step, I think.
24th September 2018

Back in February, I got curious to see if VLC now supported
Bittorrent streaming. It did not, despite the fact that the idea and
code to handle such streaming had been floating around for years. I
did however find a standalone plugin for VLC to do it, and half a year
later I decided to wrap up the plugin and get it into Debian. I
uploaded it to NEW a few days ago, and am very happy to report that it
entered Debian a few hours ago, and should be available in
Debian/Unstable tomorrow, and Debian/Testing in a few days.

With the vlc-plugin-bittorrent package installed you should be able
to stream videos using a simple call to

vlc https://archive.org/download/TheGoat/TheGoat_archive.torrent

It can handle magnet links too. Now if only native vlc had
bittorrent support. Then a lot more people would be helping each
other to share public domain and creative commons movies. The plugin
needs some stability work with seeking and picking the right file in a
torrent with many files, but is already usable. Please note that the
plugin does not remove downloaded files when vlc is stopped, so it can
fill up your disk if you are not careful. Have fun. :)

I would love to get help maintaining this package. Get in touch if
you are interested.

As usual, if you use Bitcoin and want to show your support of my
activities, please send Bitcoin donations to my address
15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.
19th March 2017

The Nikita Noark 5 core project is implementing the Norwegian
standard for keeping an electronic archive of government documents.
The Noark 5 standard documents the requirements for data systems used
by the archives in the Norwegian government, and the Noark 5 web
interface specification documents a REST web service for storing,
searching and retrieving documents and metadata in such an archive.
I've been involved in the project since a few weeks before Christmas,
when the Norwegian Unix User Group announced it supported the project.
I believe this is an important project, and hope it can make it
possible for the government archives in the future to use free
software to keep the archives we citizens depend on. But as I do not
hold such an archive myself, personally my first use case is to store
and analyse public mail journal metadata published by the government.
I find it useful to have a clear use case in mind when developing, to
make sure the system scratches one of my itches.

If you would like to help make sure there is a free software
alternative for the archives, please join our IRC channel (#nikita on
irc.freenode.net) and the project mailing list.

When I got involved, the web service could store metadata about
documents. But a few weeks ago, a new milestone was reached when it
became possible to store full text documents too. Yesterday, I
completed an implementation of a command line tool archive-pdf to
upload a PDF file to the archive using this API. The tool is very
simple at the moment, and finds existing fonds, series and files,
asking the user to select which one to use if more than one exists.
Once a file is identified, the PDF is associated with the file and
uploaded, using the title extracted from the PDF itself. The process
is fairly similar to visiting the archive, opening a cabinet, locating
a file and storing a piece of paper in the archive. Here is a test
run directly after populating the database with test data using our
API tester:
~/src//noark5-tester$ ./archive-pdf mangelmelding/mangler.pdf
using arkiv: Title of the test fonds created 2017-03-18T23:49:32.103446
using arkivdel: Title of the test series created 2017-03-18T23:49:32.103446

 0 - Title of the test case file created 2017-03-18T23:49:32.103446
 1 - Title of the test file created 2017-03-18T23:49:32.103446
Select which mappe you want (or search term): 0
Uploading mangelmelding/mangler.pdf
  PDF title: Mangler i spesifikasjonsdokumentet for NOARK 5 Tjenestegrensesnitt
  File 2017/1: Title of the test case file created 2017-03-18T23:49:32.103446
~/src//noark5-tester$
You can see here how the fonds (arkiv) and series (arkivdel) only
had one option, while the user needed to choose which file (mappe) to
use among the two created by the API tester. The archive-pdf tool can
be found in the git repository for the API tester.

In the project, I have been mostly working on the API tester so far,
while getting to know the code base. The API tester currently uses
the HATEOAS links to traverse the entire exposed service API and
verify that the exposed operations and objects match the
specification, as well as trying to create objects holding metadata
and uploading a simple XML file to store. The tester has proved very
useful for finding flaws in our implementation, as well as flaws in
the reference site and the specification.
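The core of such HATEOAS traversal is simple. Here is a sketch (my
own simplification; the exact JSON layout and relation names of the
real service may differ) of looking up the href for a given link
relation in an entity returned by the API:

```python
def find_link(entity, rel):
    """Return the href of the first HATEOAS link with the given
    relation, or None when the entity does not expose it."""
    for link in entity.get('_links', []):
        if link.get('rel') == rel:
            return link.get('href')
    return None

# A hypothetical fonds (arkiv) entity with a single link
fonds = {'_links': [
    {'rel': 'self',
     'href': 'http://localhost:8092/noark5v4/arkivstruktur/arkiv/1'},
]}
print(find_link(fonds, 'self'))  # → http://localhost:8092/noark5v4/arkivstruktur/arkiv/1
```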
The test document I uploaded is a summary of all the specification
defects we have collected so far while implementing the web service.
There are several unclear and conflicting parts of the specification,
and we have started writing down the questions we get from
implementing it. We use a format inspired by how The Austin Group
collects defect reports for the POSIX standard with their instructions
for the MANTIS defect tracker system, in lack of an official way to
structure defect reports for Noark 5 (our first submitted defect
report was a request for a procedure for submitting defect reports
:).

The Nikita project is implemented using Java and Spring, and is
fairly easy to get up and running using Docker containers for those
that want to test the current code base. The API tester is
implemented in Python.
2nd September 2018

I continue to explore my Kodi installation, and today I wanted to
tell it to play a youtube URL I received in a chat, without having to
insert search terms using the on-screen keyboard. After searching the
web for API access to the Youtube plugin and testing a bit, I managed
to find a recipe that worked. If you have a kodi instance with its
API available from http://kodihost/jsonrpc, you can try the following
to check out a nice cover band.

curl --silent --header 'Content-Type: application/json' \
  --data-binary '{ "id": 1, "jsonrpc": "2.0", "method": "Player.Open",
  "params": {"item": { "file":
  "plugin://plugin.video.youtube/play/?video_id=LuRGVM9O0qg" } } }' \
  http://projector.local/jsonrpc
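The same call is easy to make from other code. Here is a small
Python sketch using only the standard library (the JSON-RPC URL is of
course specific to my setup):

```python
import json
import urllib.request

def youtube_play_request(video_id):
    """Build the JSON-RPC payload asking Kodi to open a Youtube
    video through the Youtube plugin."""
    return {
        'id': 1,
        'jsonrpc': '2.0',
        'method': 'Player.Open',
        'params': {'item': {'file':
            'plugin://plugin.video.youtube/play/?video_id=' + video_id}},
    }

def kodi_play(jsonrpc_url, video_id):
    """Send the request to a Kodi JSON-RPC endpoint and return the
    decoded reply."""
    req = urllib.request.Request(
        jsonrpc_url,
        data=json.dumps(youtube_play_request(video_id)).encode('utf-8'),
        headers={'Content-Type': 'application/json'})
    with urllib.request.urlopen(req) as response:
        return json.load(response)

# kodi_play('http://projector.local/jsonrpc', 'LuRGVM9O0qg')
```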
I've extended the kodi-stream program to take a video source as its
first argument. It can now handle direct video links, youtube links
and 'desktop' to stream my desktop to Kodi. It is almost like a
Chromecast. :)

As usual, if you use Bitcoin and want to show your support of my
activities, please send Bitcoin donations to my address
15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.
9th March 2017

Over the years, administrating thousands of NFS-mounting Linux
computers at a time, I often needed a way to detect if a machine was
experiencing an NFS hang. If you try to use df or look at a file or
directory affected by the hang, the process (and possibly the shell)
will hang too. So you want to be able to detect this without risking
the detection process getting stuck too. It has not been obvious how
to do this. When the hang has lasted a while, it is possible to find
messages like these in dmesg:

nfs: server nfsserver not responding, still trying
nfs: server nfsserver OK

It is hard to know if the hang is still going on, and it is hard to
be sure looking in dmesg is going to work. If there are lots of other
messages in dmesg, the lines might have rotated out of sight before
they are noticed.
While reading through the NFS client implementation in the Linux
kernel code, I came across some statistics that seem to give a way to
detect it. The om_timeouts sunrpc value in the kernel will increase
every time the above log entry is inserted into dmesg. And after
digging a bit further, I discovered that this value shows up in
/proc/self/mountstats on Linux.

The mountstats content seems to be shared between files using the
same file system context, so it is enough to check one of the
mountstats files to get the state of the mount point for the machine.
I assume this will not show lazily umounted NFS points, nor NFS mount
points in a different process context (ie with a different filesystem
view), but that does not worry me.

The content for an NFS mount point looks similar to this:
[...]
device /dev/mapper/Debian-var mounted on /var with fstype ext3
device nfsserver:/mnt/nfsserver/home0 mounted on /mnt/nfsserver/home0 with fstype nfs statvers=1.1
 opts: rw,vers=3,rsize=65536,wsize=65536,namlen=255,acregmin=3,acregmax=60,acdirmin=30,acdirmax=60,soft,nolock,proto=tcp,timeo=600,retrans=2,sec=sys,mountaddr=129.240.3.145,mountvers=3,mountport=4048,mountproto=udp,local_lock=all
 age: 7863311
 caps: caps=0x3fe7,wtmult=4096,dtsize=8192,bsize=0,namlen=255
 sec: flavor=1,pseudoflavor=1
 events: 61063112 732346265 1028140 35486205 16220064 8162542 761447191 71714012 37189 3891185 45561809 110486139 4850138 420353 15449177 296502 52736725 13523379 0 52182 9016896 1231 0 0 0 0 0
 bytes: 166253035039 219519120027 0 0 40783504807 185466229638 11677877 45561809
 RPC iostats version: 1.0  p/v: 100003/3 (nfs)
 xprt: tcp 925 1 6810 0 0 111505412 111480497 109 2672418560317 0 248 53869103 22481820
 per-op statistics
         NULL: 0 0 0 0 0 0 0 0
      GETATTR: 61063106 61063108 0 9621383060 6839064400 453650 77291321 78926132
      SETATTR: 463469 463470 0 92005440 66739536 63787 603235 687943
       LOOKUP: 17021657 17021657 0 3354097764 4013442928 57216 35125459 35566511
       ACCESS: 14281703 14290009 5 2318400592 1713803640 1709282 4865144 7130140
     READLINK: 125 125 0 20472 18620 0 1112 1118
         READ: 4214236 4214237 0 715608524 41328653212 89884 22622768 22806693
        WRITE: 8479010 8494376 22 187695798568 1356087148 178264904 51506907 231671771
       CREATE: 171708 171708 0 38084748 46702272 873 1041833 1050398
        MKDIR: 3680 3680 0 773980 993920 26 23990 24245
      SYMLINK: 903 903 0 233428 245488 6 5865 5917
        MKNOD: 80 80 0 20148 21760 0 299 304
       REMOVE: 429921 429921 0 79796004 61908192 3313 2710416 2741636
        RMDIR: 3367 3367 0 645112 484848 22 5782 6002
       RENAME: 466201 466201 0 130026184 121212260 7075 5935207 5961288
         LINK: 289155 289155 0 72775556 67083960 2199 2565060 2585579
      READDIR: 2933237 2933237 0 516506204 13973833412 10385 3190199 3297917
  READDIRPLUS: 1652839 1652839 0 298640972 6895997744 84735 14307895 14448937
       FSSTAT: 6144 6144 0 1010516 1032192 51 9654 10022
       FSINFO: 2 2 0 232 328 0 1 1
     PATHCONF: 1 1 0 116 140 0 0 0
       COMMIT: 0 0 0 0 0 0 0 0

device binfmt_misc mounted on /proc/sys/fs/binfmt_misc with fstype binfmt_misc
[...]
The key number to look at is the third number in the per-op list.
It is the number of NFS timeouts experienced per file system
operation, here 22 write timeouts and 5 access timeouts. If these
numbers are increasing, I believe the machine is experiencing NFS
hang. Unfortunately the timeout value does not start to increase
right away. The NFS operations need to time out first, and this can
take a while. The exact timeout value depends on the setup. For
example the defaults for TCP and UDP mount points are quite different,
and the timeout value is affected by the soft, hard, timeo and retrans
NFS mount options.
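A check along those lines can be sketched in a few lines of Python.
This is my own illustration, not battle-tested monitoring code; a
real monitor would read /proc/self/mountstats twice and compare the
counters between samples:

```python
import re

def nfs_timeout_counts(mountstats_text):
    """Extract the per-operation timeout counter (the third number
    on each per-op line) from /proc/self/mountstats style text."""
    counts = {}
    in_perop = False
    for line in mountstats_text.splitlines():
        line = line.strip()
        if line == 'per-op statistics':
            in_perop = True
        elif line.startswith('device '):
            in_perop = False
        elif in_perop:
            m = re.match(r'([A-Z]+): (\d+) (\d+) (\d+)', line)
            if m:
                counts[m.group(1)] = int(m.group(4))
    return counts

sample = """\
per-op statistics
        ACCESS: 14281703 14290009 5 2318400592 1713803640 1709282 4865144 7130140
         WRITE: 8479010 8494376 22 187695798568 1356087148 178264904 51506907 231671771
"""
print(nfs_timeout_counts(sample))  # → {'ACCESS': 5, 'WRITE': 22}
```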
The only way I have found to get the timeout count on Debian and
RedHat Enterprise Linux is to peek in /proc/. But according to the
Solaris 10 System Administration Guide: Network Services, the
'nfsstat -c' command can be used to get these timeout values. But
this does not work on Linux, as far as I can tell. I asked Debian
about this, but have not seen any replies yet.

Is there a better way to figure out if a Linux NFS client is
experiencing NFS hangs? Is there a way to detect which processes are
affected? Is there a way to get the NFS mount going quickly once the
network problem causing the NFS hang has been cleared? I would very
much welcome some clues, as we regularly run into NFS hangs.
30th August 2018

It might seem obvious that software created using tax money should
be available for everyone to use and improve. The Free Software
Foundation Europe recently started a campaign to help get more people
to understand this, and I just signed the petition on Public Money,
Public Code to help them. I hope you too will do the same.