X-Git-Url: http://pere.pagekite.me/gitweb/homepage.git/blobdiff_plain/22433eecbaebdec4de0af204d146da0d9722ac1f..b279793fc3681f16b43cc7dba38f0d6f4e22d247:/blog/index.html diff --git a/blog/index.html b/blog/index.html index 625e248011..133f41dd38 100644 --- a/blog/index.html +++ b/blog/index.html @@ -20,159 +20,59 @@
-
E-tjenesten ber om innsyn i eposten til partiene på Stortinget
-
6th September 2016
-

This weekend a hair-raising proposal arrived from the Lysne II committee appointed by the Norwegian Ministry of Defence. The Lysne II committee was asked to evaluate the wish list of the Norwegian military intelligence service (e-tjenesten), and has proposed that the intelligence service be allowed to wiretap all Internet traffic crossing the Norwegian border. Few are aware that this means the intelligence service will get access to email sent to most of the political parties in the Norwegian parliament. The governing party Høyre (@hoyre.no), the supporting parties Venstre (@venstre.no) and Kristelig Folkeparti (@krf.no), as well as Sosialistisk Venstreparti (@sv.no) and Miljøpartiet De Grønne (@mdg.no), have all chosen to receive their email via foreign services. This means that if someone sends email to such an address, the content of the email will, should this proposal be adopted, be made available to the intelligence service. Venstre, Sosialistisk Venstreparti and Miljøpartiet De Grønne receive their email at Google, Kristelig Folkeparti receives its email at Microsoft, and Høyre receives its email at Comendo with delivery points in Denmark and Ireland. Only Arbeiderpartiet and Fremskrittspartiet have chosen to receive their email in Norway, at Intility AS and Telecomputing AS respectively.

- -

The consequence is that email in and out of these political organisations, to and from party members and the parties' elected officials, will be made available to the intelligence service for analysis and sorting. I suspect the knowledge that becomes available this way will be useful for anyone wanting to know which arguments resonate with the public when trying to influence the members of parliament.

Using MX lookups in DNS for the email domains, whois lookups of the IP addresses involved, and traceroute to see whether the traffic passes through other countries, anyone can confirm that email sent to the parties mentioned will be made available to the military intelligence service if the proposal is adopted. One can also use the handy web service ipinfo.io to get an idea of where in the world an IP address belongs.
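For those who want to check for themselves, a rough sketch along these lines should work. The domain venstre.no is just one of the domains mentioned above, and the MX host and IP address in angle brackets are placeholders to fill in from the output of the previous commands:

# Find the mail servers receiving email for the domain.
dig +short MX venstre.no
# Resolve one of the MX hosts to an IP address.
dig +short A <mx-host>
# Look up who owns the IP address and where it is registered.
whois <ip-address>
# See whether the traffic to the mail server leaves the country.
traceroute <mx-host>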

- -

On the positive side, the proposal will motivate even more people to take steps to use Tor and encrypted communication solutions to talk to their loved ones, to make sure their privacy is protected. I use, among other things, FreedomBox and Signal for this. Neither is perfect, but they already work quite well and increase the cost for those wanting to invade my privacy.

- -

By the way, the whistleblower Edward Snowden ought to be granted political asylum in Norway.

- - + +
10th June 2017
+

I am very happy to report that the +Nikita Noark 5 +core project tagged its second release today. The free software +solution is an implementation of the Norwegian archive standard Noark +5 used by government offices in Norway. These were the changes in +version 0.1.1 since version 0.1.0 (from NEWS.md): + +

    + +
  • Continued work on the angularjs GUI, including document upload.
  • +
  • Implemented correspondencepartPerson, correspondencepartUnit and + correspondencepartInternal
  • +
  • Applied for Coverity coverage and started submitting code on a regular basis.
  • +
  • Started fixing bugs reported by coverity
  • +
  • Corrected and completed HATEOAS links to make sure entire API is + available via URLs in _links.
  • +
  • Corrected all relation URLs to use trailing slash.
  • +
  • Add initial support for storing data in ElasticSearch.
  • +
  • Now able to receive and store uploaded files in the archive.
  • +
  • Changed JSON output for object lists to have relations in _links.
  • +
  • Improve JSON output for empty object lists.
  • +
  • Now uses correct MIME type application/vnd.noark5-v4+json.
  • +
  • Added support for docker container images.
  • +
  • Added simple API browser implemented in JavaScript/Angular.
  • +
  • Started on archive client implemented in JavaScript/Angular.
  • +
  • Started on prototype to show the public mail journal.
  • +
  • Improved performance by disabling the Spring FileWatcher.
  • +
  • Added support for 'arkivskaper', 'saksmappe' and 'journalpost'.
  • +
  • Added support for some metadata codelists.
  • +
  • Added support for Cross-origin resource sharing (CORS).
  • +
  • Changed login method from Basic Auth to JSON Web Token (RFC 7519) + style.
  • +
  • Added support for GET-ing ny-* URLs.
  • +
  • Added support for modifying entities using PUT and eTag.
  • +
  • Added support for returning XML output on request.
  • +
  • Removed support for English field and class names, limiting ourselves to the official names.
  • +
  • ...
  • + +
+ +

If this sounds interesting to you, please contact us on IRC (#nikita on irc.freenode.net) or email (the nikita-noark mailing list).

@@ -180,33 +80,99 @@ traceroute to mx03.telecomputing.no (95.128.105.102), 30 hops max, 60 byte packe
- -
30th August 2016
-

In April we -started -to work on a Norwegian Bokmål edition of the "open access" book on -how to set up and administrate a Debian system. Today I am happy to -report that the first draft is now publicly available. You can find -it on get the Debian -Administrator's Handbook page (under Other languages). The first -eight chapters have a first draft translation, and we are working on -proofreading the content. If you want to help out, please start -contributing using -the -hosted weblate project page, and get in touch using -the -translators mailing list. Please also check out -the instructions for -contributors. A good way to contribute is to proofread the text -and update weblate if you find errors.

- -

Our goal is still to make the Norwegian book available on paper as well as -electronic form.

+ +
7th June 2017
+

This is a copy of +an +email I posted to the nikita-noark mailing list. Please follow up +there if you would like to discuss this topic. The background is that +we are making a free software archive system based on the Norwegian +Noark +5 standard for government archives.

+ +

I've been wondering a bit lately how trusted timestamps could be stored in Noark 5. Trusted timestamps can be used to verify that some information (document/file/checksum/metadata) has not been changed since a specific time in the past. This is useful to verify the integrity of the documents in the archive.

+ +

Then it occurred to me: perhaps the trusted timestamps could be stored as dokument variants (i.e. dokumentobjekt referred to from dokumentbeskrivelse) with the filename set to the hash it is stamping?

+ +

Given a "dokumentbeskrivelse" with an associated "dokumentobjekt", +a new dokumentobjekt is associated with "dokumentbeskrivelse" with the +same attributes as the stamped dokumentobjekt except these +attributes:

+ +
    + +
  • format -> "RFC3161" +
  • mimeType -> "application/timestamp-reply" +
  • formatDetaljer -> "<source URL for timestamp service>" +
  • filenavn -> "<sjekksum>.tsr" + +
+ +

This assumes a service following IETF RFC 3161 is used, which specifies the given MIME type for replies and the .tsr file ending for the content of such trusted timestamps. As far as I can tell from the Noark 5 specifications, it is OK to have several variants/renderings of a dokument attached to a given dokumentbeskrivelse objekt. It might be stretching it a bit to make some of these variants represent crypto-signatures useful for verifying the document integrity instead of representing the dokument itself.

+ +

Using the source of the service in formatDetaljer allows several timestamping services to be used. This is useful to spread the risk of key compromise over several organisations. It would only be a problem to trust the timestamps if all of the organisations were compromised.

+ +

The following one-liner on Linux can be used to generate the tsr file. $inputfile is the path to the file to checksum, and $sha256 is the SHA-256 checksum of the file (i.e. the <sjekksum> part of the file name mentioned above).

+ +

+openssl ts -query -data "$inputfile" -cert -sha256 -no_nonce \
+  | curl -s -H "Content-Type: application/timestamp-query" \
+      --data-binary "@-" http://zeitstempel.dfn.de > $sha256.tsr
+
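In case it is useful, the $sha256 value itself could be computed like this (a small sketch using sha256sum from GNU coreutils):

sha256=$(sha256sum "$inputfile" | cut -d' ' -f1)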

+ +

To verify the timestamp, you first need to download the public key +of the trusted timestamp service, for example using this command:

+ +

+wget -O ca-cert.txt \
+  https://pki.pca.dfn.de/global-services-ca/pub/cacert/chain.txt
+

+ +

Note, the public key should be stored alongside the timestamps in the archive to make sure it is also available 100 years from now. It is probably a good idea to standardise how and where to store such public keys, to make them easier to find for those trying to verify documents 100 or 1000 years from now. :)

+ +

The verification itself is a simple openssl command:

+ +

+openssl ts -verify -data $inputfile -in $sha256.tsr \
+  -CAfile ca-cert.txt -text
+
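To inspect what a given timestamp reply actually contains, for example when it was issued and which hash algorithm was used, the reply can be printed in human readable form like this:

openssl ts -reply -in $sha256.tsr -text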

+ +

Is there any reason this approach would not work? Is it somehow against +the Noark 5 specification?

@@ -214,75 +180,61 @@ electronic form.

- -
11th August 2016
-

This summer, I read a great article "coz: This Is the Profiler You're Looking For" in USENIX ;login: about how to profile multi-threaded programs. It presented a system for profiling software by running experiments in the running program, testing how run time performance is affected by "speeding up" parts of the code to various degrees compared to a normal run. It does this by slowing down parallel threads while the "sped up" code is running and measuring how this affects processing time. The processing time is measured using probes inserted into the code, either as progress counters (COZ_PROGRESS) or as latency meters (COZ_BEGIN/COZ_END). It can also measure unmodified code by measuring the complete program runtime and running the program several times instead.

- -

The project and presentation were so inspiring that I would like to get the system into Debian. I created a WNPP request for it and contacted upstream to try to make the system ready for Debian by sending patches. The build process needs to be changed a bit to avoid running 'git clone' to get dependencies, and to include the JavaScript web page used to visualize the collected profiling information in the source package. But I expect that should work out fairly soon.

- -

The way the system works is fairly simple. To run a coz experiment on a binary with debug symbols available, start the program like this:

-coz run --- program-to-run
-

- -

This will create a text file profile.coz with the instrumentation information. To show what part of the code affects the performance most, use a web browser and either point it to http://plasma-umass.github.io/coz/ or use the copy from git (in the gh-pages branch). Check out this web site to have a look at several example profiling runs and get an idea of what the end result from the profile runs looks like. To make the profiling more useful, include <coz.h> and insert COZ_PROGRESS or COZ_BEGIN and COZ_END at appropriate places in the code, rebuild and run the profiler. This allows coz to do more targeted experiments.

- -

A video published by ACM -presenting the -Coz profiler is available from Youtube. There is also a paper -from the 25th Symposium on Operating Systems Principles available -titled -Coz: -finding code that counts with causal profiling.

- -

The source code -for Coz is available from github. It will only build with clang -because it uses a -C++ -feature missing in GCC, but I've submitted -a patch to solve -it and hope it will be included in the upstream source soon.

- -

Please get in touch if you, like me, would like to see this piece -of software in Debian. I would very much like some help with the -packaging effort, as I lack the in depth knowledge on how to package -C++ libraries.

+ +
3rd June 2017
+

Aftenposten reports today about errors in the exam questions for the exam in politics and human rights, where the text in the Bokmål and Nynorsk editions differed. The exam text is quoted in the article, and I got curious whether the free translation solution Apertium would have done a better job than the Norwegian Directorate for Education and Training (Utdanningsdirektoratet). It looks like it would have.

+ +

Here is the Bokmål exam text:

+ +
+

Drøft utfordringene knyttet til nasjonalstatenes og andre aktørers +rolle og muligheter til å håndtere internasjonale utfordringer, som +for eksempel flykningekrisen.

+ +

Vedlegge er eksempler på tekster som kan gi relevante perspektiver +på temaet:

+
    +
  1. Flykningeregnskapet 2016, UNHCR og IDMC +
  2. «Grenseløst Europa for fall» A-Magasinet, 26. november 2015 +
+ +
+ +

Apertium translates this as follows:

+ +
+

Drøft utfordringane knytte til nasjonalstatane sine og rolla til +andre aktørar og høve til å handtera internasjonale utfordringar, som +til dømes *flykningekrisen.

+ +

Vedleggja er døme på tekster som kan gje relevante perspektiv på +temaet:

+ +
    +
  1. *Flykningeregnskapet 2016, *UNHCR og *IDMC
  2. +
  3. «*Grenseløst Europa for fall» A-Magasinet, 26. november 2015
  4. +
+ +
+ +

Words that were not understood are marked with an asterisk (*) and need an extra language check. But no words have disappeared, as happened in the exam text the pupils were presented with. I do suspect that "andre aktørers rolle og muligheter til ..." should have been translated to "rolla til andre aktørar og deira høve til ..." or something like that, but that is perhaps nitpicking. It just underlines that proofreading is always needed after automatic translation.
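For those who want to repeat the experiment, something along these lines should do the trick, assuming the apertium program and the Bokmål-Nynorsk language pair (packaged as apertium-nno-nob in Debian) are installed. The mode name nob-nno is my assumption; the exact name can be checked under /usr/share/apertium/modes/:

echo "Drøft utfordringene knyttet til nasjonalstatenes og andre aktørers
rolle og muligheter til å håndtere internasjonale utfordringer, som
for eksempel flykningekrisen." | apertium nob-nno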

@@ -290,58 +242,67 @@ C++ libraries.

- -
5th August 2016
-

As my regular readers probably remember, last year I published a French and Norwegian translation of the classic Free Culture book by the founder of the Creative Commons movement, Lawrence Lessig. A bit less known is the fact that due to the way I created the translations, using docbook and po4a, I also recreated the English original. And because I had already created a new PDF edition, I published it too. The revenue from the books is sent to the Creative Commons Corporation. In other words, I do not earn any money from this project, I just earn the warm fuzzy feeling that the text is available for a wider audience and more people can learn why the Creative Commons is needed.

- -

Today, just for fun, I had a look at the sales numbers over at Lulu.com, which takes care of payment, printing and shipping. Much to my surprise, the English edition is selling better than both the French and Norwegian editions, despite the fact that it has been available in English since it was first published. In total, 24 paper books were sold for USD $19.99 between 2016-01-01 and 2016-07-31:

- - - - - - -
Title / language          Quantity
Culture Libre / French           3
Fri kultur / Norwegian           7
Free Culture / English          14
- -

The books are available both from Lulu.com and from large book stores like Amazon and Barnes&Noble. Most revenue, around $10 per book, is sent to the Creative Commons project when the book is sold directly by Lulu.com. The other channels give less revenue. The summary from Lulu tells me 10 books were sold via the Amazon channel, 10 via Ingram (what is this?) and 4 directly by Lulu. And Lulu.com tells me that the revenue sent so far this year is USD $101.42. I have no idea what kind of sales numbers to expect, so I do not know if that is a good amount of sales for a 10 year old book or not. But it makes me happy that the buyers find the book, and I hope they enjoy reading it as much as I did.

- -

The ebook edition is available for free from -Github.

- -

If you would like to translate and publish the book in your native -language, I would be happy to help make it happen. Please get in -touch.

+ +
27th April 2017
+

These days, with a deadline of May 1st, the National Archivist of Norway (Riksarkivaren) has a hearing open on his regulation. As you can see, there is not much time left before the deadline, which runs out on Sunday. This regulation is what lists which formats are acceptable to archive in Noark 5 solutions in Norway.

+ +

I found the hearing documents at Norsk Arkivråd after being tipped off on the mailing list of the free software project Nikita Noark5-Core, which is creating a Noark 5 Tjenestegrensesnitt (service interface). I am involved in the Nikita project, and thanks to my interest in the service interface project I have read quite a few Noark 5 related documents, and to my surprise discovered that standard email is not on the list of approved formats that can be archived. The hearing with its deadline on Sunday is an excellent opportunity to try to do something about that. I am working on my own hearing statement, and wonder if others are interested in supporting the proposal to allow archiving email as email in the archive.

+ +

Are you already writing your own hearing statement? If so, you could consider including a sentence or two about email storage. I do not think much is needed. Here is a short text proposal:

+ +

+ +

Viser til høring sendt ut 2017-02-17 (Riksarkivarens referanse + 2016/9840 HELHJO), og tillater oss å sende inn noen innspill om + revisjon av Forskrift om utfyllende tekniske og arkivfaglige + bestemmelser om behandling av offentlige arkiver (Riksarkivarens + forskrift).

+ +

Svært mye av vår kommunikasjon foregår i dag på e-post.  Vi foreslår derfor at Internett-e-post, slik det er beskrevet i IETF RFC 5322, https://tools.ietf.org/html/rfc5322, bør inn som godkjent dokumentformat.  Vi foreslår at forskriftens oversikt over godkjente dokumentformater ved innlevering i § 5-16 endres til å ta med Internett-e-post.

+ +

+ +

As part of the work on the service interface, we have tested how email can be stored in a Noark 5 structure, and we are writing a proposal for how this can be done, which will be sent to the National Archives (Arkivverket) as soon as it is finished. Those interested can follow the progress on the web.

+ +

Update 2017-04-28: Today the hearing statement I wrote was submitted by the NUUG association.

@@ -349,40 +310,52 @@ touch.

- -
1st August 2016
-

Many years ago I read a classic text that made such an impression on me that I still remember it, years later, and keep using its arguments. The text was «The Relativity of Wrong», which Isaac Asimov published in Skeptical Inquirer in 1989. It gives some perspective on how scientific results are communicated. I have wanted to be able to share it with people who do not know English that well, like children and some of my older relatives, and have missed having it available in Norwegian. Two weeks ago I pulled myself together and contacted Asbjørn Dyrendal in the Skepsis association to ask if they were interested in publishing a Norwegian edition on their blog, and when he was positive I contacted Skeptical Inquirer and asked if it was OK with them. Within a few days we got a reply from Barry Karr at The Skeptical Inquirer, who had checked and received an OK from Robyn Asimov representing the heirs of the Asimov family, and we started the translation.

- -

The result, «Relativt feil», was published on the Skepsis blog a few minutes ago. I warmly recommend reading this text and sharing it with your friends.

- -

To handle the translation and make sure the original and the translation stayed in sync, we used git, po4a, GNU make and Transifex. It all worked very well and made it easy to share the texts and work together on polishing the wording. Had hosted.weblate.org let me create new projects myself instead of having to contact the administrator there, I would have used Weblate instead.

+ +
20th April 2017
+

I discovered today that OEP, the web site publishing public mail journals from Norwegian government agencies, has started blocking certain types of web clients from getting access. I do not know how many are affected, but at least libwww-perl and curl are blocked. To test for yourself, run the following:

+ +
+% curl -v -s https://www.oep.no/pub/report.xhtml?reportId=3 2>&1 |grep '< HTTP'
+< HTTP/1.1 404 Not Found
+% curl -v -s --header 'User-Agent:Opera/12.0' https://www.oep.no/pub/report.xhtml?reportId=3 2>&1 |grep '< HTTP'
+< HTTP/1.1 200 OK
+%
+
+ +

Here you can see that the service returns «404 Not Found» for curl in its default setup, while it returns «200 OK» if curl claims to be Opera version 12.0. Offentlig elektronisk postjournal started the blocking on 2017-03-02.

+ +

The blocking will make it a bit harder to automatically fetch information from oep.no. Could the blocking have been put in place to prevent automated collection of information from OEP, the way Pressens Offentlighetsutvalg did to document how the ministries obstruct access to information in the report «Slik hindrer departementer innsyn», published in January 2017? It seems unlikely, as it is trivial to change the User-Agent to something new.

+ +

Is there any legal basis for the public sector to discriminate between web clients the way it is done here, where access is granted or not depending on what the client says its name is? As OEP is owned by DIFI and operated by Basefarm, perhaps there are documents sent between these two parties one could request access to in order to understand what has happened. But the public mail journal of DIFI only shows two documents between DIFI and Basefarm during the last year. Mimes brønn next, I guess.

- Tags: norsk, skepsis. + Tags: norsk, offentlig innsyn.
@@ -390,51 +363,101 @@ administratoren der, så hadde jeg brukt weblate i stedet.

- -
1st August 2016
-

Did you know there is a TV channel broadcasting talks from DebConf -16 across an entire country? Or that there is a TV channel -broadcasting talks by or about -Linus Torvalds, -Tor, -OpenID, -Common Lisp, -Civic Tech, -EFF founder John Barlow, -how to make 3D -printer electronics and many more fascinating topics? It works -using only free software (all of it -available from Github), and -is administrated using a web browser and a web API.

- -

The TV channel is the Norwegian open channel -Frikanalen, and I am involved -via the NUUG member association in -running and developing the software for the channel. The channel is -organised as a member organisation where its members can upload and -broadcast what they want (think of it as Youtube for national -broadcasting television). Individuals can broadcast too. The time -slots are handled on a first come, first serve basis. Because the -channel have almost no viewers and very few active members, we can -experiment with TV technology without too much flack when we make -mistakes. And thanks to the few active members, most of the slots on -the schedule are free. I see this as an opportunity to spread -knowledge about technology and free software, and have a script I run -regularly to fill up all the open slots the next few days with -technology related video. The end result is a channel I like to -describe as Techno TV - filled with interesting talks and -presentations.

- -

It is available on channel 50 on the Norwegian national digital TV -network (RiksTV). It is also available as a multicast stream on -Uninett. And finally, it is available as -a WebM unicast stream from -Frikanalen and NUUG. Check it out. :)

+ +
19th March 2017
+

The Nikita Noark 5 core project is implementing the Norwegian standard for keeping an electronic archive of government documents. The Noark 5 standard documents the requirements for data systems used by the archives in the Norwegian government, and the Noark 5 web interface specification documents a REST web service for storing, searching and retrieving documents and metadata in such an archive. I've been involved in the project since a few weeks before Christmas, when the Norwegian Unix User Group announced it supported the project. I believe this is an important project, and hope it can make it possible for the government archives in the future to use free software to keep the archives we citizens depend on. But as I do not hold such an archive myself, personally my first use case is to store and analyse public mail journal metadata published by the government. I find it useful to have a clear use case in mind when developing, to make sure the system scratches one of my itches.

+ +

If you would like to help make sure there is a free software alternative for the archives, please join our IRC channel (#nikita on irc.freenode.net) and the project mailing list.

+ +

When I got involved, the web service could store metadata about documents. But a few weeks ago, a new milestone was reached when it became possible to store full text documents too. Yesterday, I completed an implementation of a command line tool archive-pdf to upload a PDF file to the archive using this API. The tool is very simple at the moment, and finds existing fonds, series and files, asking the user to select which one to use if more than one exists. Once a file is identified, the PDF is associated with the file and uploaded, using the title extracted from the PDF itself. The process is fairly similar to visiting the archive, opening a cabinet, locating a file and storing a piece of paper in the archive. Here is a test run directly after populating the database with test data using our API tester:

+ +

+~/src//noark5-tester$ ./archive-pdf mangelmelding/mangler.pdf
+using arkiv: Title of the test fonds created 2017-03-18T23:49:32.103446
+using arkivdel: Title of the test series created 2017-03-18T23:49:32.103446
+
+ 0 - Title of the test case file created 2017-03-18T23:49:32.103446
+ 1 - Title of the test file created 2017-03-18T23:49:32.103446
+Select which mappe you want (or search term): 0
+Uploading mangelmelding/mangler.pdf
+  PDF title: Mangler i spesifikasjonsdokumentet for NOARK 5 Tjenestegrensesnitt
+  File 2017/1: Title of the test case file created 2017-03-18T23:49:32.103446
+~/src//noark5-tester$
+

+ +

You can see here how the fonds (arkiv) and series (arkivdel) only had one option, while the user needs to choose which file (mappe) to use among the two created by the API tester. The archive-pdf tool can be found in the git repository for the API tester.

+ +

In the project, I have been mostly working on the API tester so far, while getting to know the code base. The API tester currently uses the HATEOAS links to traverse the entire exposed service API and verify that the exposed operations and objects match the specification, as well as trying to create objects holding metadata and uploading a simple XML file to store. The tester has proved very useful for finding flaws in our implementation, as well as flaws in the reference site and the specification.
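As a taste of what the tester does, the service can also be explored by hand. Here is a rough sketch of how the entry point could be fetched with curl to list the relation URLs exposed in _links; the host name, port and path are placeholders for a locally running Nikita instance, not confirmed values:

# Fetch the service root and pretty-print the JSON, including the _links
# relations the API tester follows to reach the rest of the API.
curl -s -H 'Accept: application/vnd.noark5-v4+json' \
  http://localhost:8080/noark5v4/ | python3 -m json.tool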

+ +

The test document I uploaded is a summary of all the specification defects we have collected so far while implementing the web service. There are several unclear and conflicting parts of the specification, and we have started writing down the questions we get from implementing it. We use a format inspired by how The Austin Group collects defect reports for the POSIX standard with their instructions for the MANTIS defect tracker system, for lack of an official way to structure defect reports for Noark 5 (our first submitted defect report was a request for a procedure for submitting defect reports :).

The Nikita project is implemented using Java and Spring, and is +fairly easy to get up and running using Docker containers for those +that want to test the current code base. The API tester is +implemented in Python.

@@ -442,90 +465,114 @@ Frikanalen and NUUG. Check it out. :)

- -
7th July 2016
-

Yesterday, I tried to unlock a HTC Desire HD phone, and it proved to be a slight challenge. Here is the recipe if I ever need to do it again. It all started by me wanting to try the recipe to set up a hardened Android installation from the Tor project blog on a device I had access to. It is an old mobile phone with a broken microphone. The initial idea had been to just install CyanogenMod on it, but I did not quite find time to start on it until a few days ago.

- -

The unlock process is supposed to be simple: (1) Boot into the boot -loader (press volume down and power at the same time), (2) select -'fastboot' before (3) connecting the device via USB to a Linux -machine, (4) request the device identifier token by running 'fastboot -oem get_identifier_token', (5) request the device unlocking key using -the HTC developer web -site and unlock the phone using the key file emailed to you.

- -

Unfortunately, this only works if you have hboot version 2.00.0029 or newer, and the device I was working on had 2.00.0027. This apparently can be easily fixed by downloading a Windows program and running it on your Windows machine, if you accept the terms Microsoft requires you to accept to use Windows - which I do not. So I had to come up with a different approach. I got a lot of help from AndyCap on #nuug, and would not have been able to get this working without him.

- -

First I needed to extract the hboot firmware from the Windows binary for the HTC Desire HD downloaded as 'the RUU' from HTC. For this there is a github project named unruu using libunshield. The unshield tool did not recognise the file format, but unruu worked and extracted rom.zip, containing the new hboot firmware and a text file describing which devices it would work for.

- -

Next, I needed to get the new firmware into the device. For this I -followed some instructions -available -from HTC1Guru.com, and ran these commands as root on a Linux -machine with Debian testing:

- -

-adb reboot-bootloader
-fastboot oem rebootRUU
-fastboot flash zip rom.zip
-fastboot flash zip rom.zip
-fastboot reboot
-

- -

The flash command apparently needs to be done twice to take effect, as the first run is just preparation and the second one does the flashing. The adb command is just to get to the boot loader menu, so turning the device on while holding volume down and the power button should work too.

- -

With the new hboot version in place I could start following the -instructions on the HTC developer web site. I got the device token -like this:

- -

-fastboot oem get_identifier_token 2>&1 | sed 's/(bootloader) //'
-
- -

And once I got the unlock code via email, I could use it like -this:

- -

-fastboot flash unlocktoken Unlock_code.bin
-

- -

And with that final step in place, the phone was unlocked and I could start stuffing the software of my own choosing into the device. So far I have only inserted a replacement recovery image to wipe the phone before I start. We will see what happens next. Perhaps I should install Debian on it. :)

+ +
9th March 2017
+

Over the years, administrating thousands of NFS-mounting Linux computers at a time, I have often needed a way to detect if a machine was experiencing an NFS hang. If you try to use df or look at a file or directory affected by the hang, the process (and possibly the shell) will hang too. So you want to be able to detect this without risking the detection process getting stuck too. It has not been obvious how to do this. When the hang has lasted a while, it is possible to find messages like these in dmesg:

+ +

+nfs: server nfsserver not responding, still trying +
nfs: server nfsserver OK +

+ +

It is hard to know if the hang is still going on, and it is hard to be sure looking in dmesg is going to work. If there are lots of other messages in dmesg, the lines might have rotated out of sight before they are noticed.

+ +

While reading through the NFS client implementation in the Linux kernel, I came across some statistics that seem to give a way to detect it. The om_timeouts sunrpc value in the kernel will increase every time the above log entry is inserted into dmesg. And after digging a bit further, I discovered that this value shows up in /proc/self/mountstats on Linux.

+ +

The mountstats content seems to be shared between files using the same file system context, so it is enough to check one of the mountstats files to get the state of the mount points for the machine. I assume this will not show lazily umounted NFS points, nor NFS mount points in a different process context (i.e. with a different filesystem view), but that does not worry me.

+ +

The content for an NFS mount point looks similar to this:

+ +

+[...]
+device /dev/mapper/Debian-var mounted on /var with fstype ext3
+device nfsserver:/mnt/nfsserver/home0 mounted on /mnt/nfsserver/home0 with fstype nfs statvers=1.1
+        opts:   rw,vers=3,rsize=65536,wsize=65536,namlen=255,acregmin=3,acregmax=60,acdirmin=30,acdirmax=60,soft,nolock,proto=tcp,timeo=600,retrans=2,sec=sys,mountaddr=129.240.3.145,mountvers=3,mountport=4048,mountproto=udp,local_lock=all
+        age:    7863311
+        caps:   caps=0x3fe7,wtmult=4096,dtsize=8192,bsize=0,namlen=255
+        sec:    flavor=1,pseudoflavor=1
+        events: 61063112 732346265 1028140 35486205 16220064 8162542 761447191 71714012 37189 3891185 45561809 110486139 4850138 420353 15449177 296502 52736725 13523379 0 52182 9016896 1231 0 0 0 0 0 
+        bytes:  166253035039 219519120027 0 0 40783504807 185466229638 11677877 45561809 
+        RPC iostats version: 1.0  p/v: 100003/3 (nfs)
+        xprt:   tcp 925 1 6810 0 0 111505412 111480497 109 2672418560317 0 248 53869103 22481820
+        per-op statistics
+                NULL: 0 0 0 0 0 0 0 0
+             GETATTR: 61063106 61063108 0 9621383060 6839064400 453650 77291321 78926132
+             SETATTR: 463469 463470 0 92005440 66739536 63787 603235 687943
+              LOOKUP: 17021657 17021657 0 3354097764 4013442928 57216 35125459 35566511
+              ACCESS: 14281703 14290009 5 2318400592 1713803640 1709282 4865144 7130140
+            READLINK: 125 125 0 20472 18620 0 1112 1118
+                READ: 4214236 4214237 0 715608524 41328653212 89884 22622768 22806693
+               WRITE: 8479010 8494376 22 187695798568 1356087148 178264904 51506907 231671771
+              CREATE: 171708 171708 0 38084748 46702272 873 1041833 1050398
+               MKDIR: 3680 3680 0 773980 993920 26 23990 24245
+             SYMLINK: 903 903 0 233428 245488 6 5865 5917
+               MKNOD: 80 80 0 20148 21760 0 299 304
+              REMOVE: 429921 429921 0 79796004 61908192 3313 2710416 2741636
+               RMDIR: 3367 3367 0 645112 484848 22 5782 6002
+              RENAME: 466201 466201 0 130026184 121212260 7075 5935207 5961288
+                LINK: 289155 289155 0 72775556 67083960 2199 2565060 2585579
+             READDIR: 2933237 2933237 0 516506204 13973833412 10385 3190199 3297917
+         READDIRPLUS: 1652839 1652839 0 298640972 6895997744 84735 14307895 14448937
+              FSSTAT: 6144 6144 0 1010516 1032192 51 9654 10022
+              FSINFO: 2 2 0 232 328 0 1 1
+            PATHCONF: 1 1 0 116 140 0 0 0
+              COMMIT: 0 0 0 0 0 0 0 0
+
+device binfmt_misc mounted on /proc/sys/fs/binfmt_misc with fstype binfmt_misc
+[...]
+

+ +

The key number to look at is the third number in the per-op list. It is the number of NFS timeouts experienced per file system operation. Here there are 22 WRITE timeouts and 5 ACCESS timeouts. If these numbers are increasing, I believe the machine is experiencing an NFS hang. Unfortunately the timeout values do not start to increase right away. The NFS operations need to time out first, and this can take a while. The exact timeout value depends on the setup. For example the defaults for TCP and UDP mount points are quite different, and the timeout value is affected by the soft, hard, timeo and retrans NFS mount options.
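To avoid reading the list by hand, a small sketch like this could be used to print every operation with a non-zero timeout count. It assumes the mountstats layout shown above, where the timeout counter is the third number after the operation name:

awk '/^device /          { inops = 0 }
     /per-op statistics/ { inops = 1; next }
     inops && $4 + 0 > 0 { print $1, $4 }' /proc/self/mountstats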

+ +

The only way I have found to get the timeout count on Debian and Red Hat Enterprise Linux is to peek in /proc/. According to the Solaris 10 System Administration Guide: Network Services, the 'nfsstat -c' command can be used to get these timeout values, but this does not work on Linux, as far as I can tell. I asked Debian about this, but have not seen any replies yet.

+ +

Is there a better way to figure out if a Linux NFS client is +experiencing NFS hangs? Is there a way to detect which processes are +affected? Is there a way to get the NFS mount going quickly once the +network problem causing the NFS hang has been cleared? I would very +much welcome some clues, as we regularly run into NFS hangs.

@@ -533,112 +580,44 @@ install Debian on it. :)

- -
3rd July 2016
-

For a while now, I have wanted to test -the Signal app, as it is -said to provide end to end encrypted communication and several of my -friends and family are already using it. As I by choice do not own a -mobile phone, this proved to be harder than expected. And I wanted to -have the source of the client and know that it was the code used on my -machine. But yesterday I managed to get it working. I used the -Github source, compared it to the source in -the -Signal Chrome app available from the Chrome web store, applied -patches to use the production Signal servers, started the app and -asked for the hidden "register without a smart phone" form. Here is -the recipe how I did it.

- -

First, I fetched the Signal desktop source from Github, using - -

-git clone https://github.com/WhisperSystems/Signal-Desktop.git
-
- -

Next, I patched the source to use the production servers, to be -able to talk to other Signal users:

- -
-cat <<EOF | patch -p0
-diff -ur ./js/background.js userdata/Default/Extensions/bikioccmkafdpakkkcpdbppfkghcmihk/0.15.0_0/js/background.js
---- ./js/background.js  2016-06-29 13:43:15.630344628 +0200
-+++ userdata/Default/Extensions/bikioccmkafdpakkkcpdbppfkghcmihk/0.15.0_0/js/background.js    2016-06-29 14:06:29.530300934 +0200
-@@ -47,8 +47,8 @@
-         });
-     });
- 
--    var SERVER_URL = 'https://textsecure-service-staging.whispersystems.org';
--    var ATTACHMENT_SERVER_URL = 'https://whispersystems-textsecure-attachments-staging.s3.amazonaws.com';
-+    var SERVER_URL = 'https://textsecure-service-ca.whispersystems.org:4433';
-+    var ATTACHMENT_SERVER_URL = 'https://whispersystems-textsecure-attachments.s3.amazonaws.com';
-     var messageReceiver;
-     window.getSocketStatus = function() {
-         if (messageReceiver) {
-diff -ur ./js/expire.js userdata/Default/Extensions/bikioccmkafdpakkkcpdbppfkghcmihk/0.15.0_0/js/expire.js
---- ./js/expire.js      2016-06-29 13:43:15.630344628 +0200
-+++ userdata/Default/Extensions/bikioccmkafdpakkkcpdbppfkghcmihk/0.15.0_0/js/expire.js2016-06-29 14:06:29.530300934 +0200
-@@ -1,6 +1,6 @@
- ;(function() {
-     'use strict';
--    var BUILD_EXPIRATION = 0;
-+    var BUILD_EXPIRATION = 1474492690000;
- 
-     window.extension = window.extension || {};
- 
-EOF
-
- -

The first part changes the servers, and the second updates an expiration timestamp. This timestamp needs to be updated regularly. It is set 90 days in the future by the build process (Gruntfile.js). The value is seconds since 1970 times 1000, as far as I can tell.
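To compute a fresh value 90 days into the future, something like this should work with GNU date:

echo $(( $(date -d '+90 days' +%s) * 1000 ))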

- -

Based on a tip and good help from the #nuug IRC channel, I wrote a -script to launch Signal in Chromium.

- -
-#!/bin/sh
-cd $(dirname $0)
-mkdir -p userdata
-exec chromium \
-  --proxy-server="socks://localhost:9050" \
-  --user-data-dir=`pwd`/userdata --load-and-launch-app=`pwd`
-
- -

The script starts the app and configures Chromium to use the Tor SOCKS5 proxy, to make sure those controlling the Signal servers (today Amazon and Whisper Systems), as well as those listening on the lines, will have a harder time locating my laptop based on the Signal connections, should they use the source IP address.

- -

When the script starts, one needs to follow the instructions under "Standalone Registration" in the CONTRIBUTING.md file in the git repository. I right clicked on the Signal window to bring up the Chromium debugging tool, visited the 'Console' tab and wrote 'extension.install("standalone")' on the console prompt to get the registration form. Then I entered my land line phone number and pressed 'Call'. 5 seconds later the phone rang and a robot voice repeated the verification code three times. After entering the number into the verification code field in the form, I could start using Signal from my laptop.

As far as I can tell, the Signal app will leak who is talking to whom, and thus who knows whom, to those controlling the central server, but such leakage is hard to avoid with a centrally controlled server setup. It is something to keep in mind when using Signal - the content of your chats is harder to intercept, but the metadata exposing your contact network is available to people you do not know. So it is better than many options, but not great. And sadly the usage is connected to my land line, thus allowing those controlling the server to associate it with my home and person. I would prefer it if only those I knew could tell who I was on Signal. There are options that avoid such information leakage, but most of my friends are not using them, so I am stuck with Signal for now.

+ +
8th March 2017
+

So the new president in the United States of America claims to be surprised to discover that he was wiretapped during the election, before he was elected president. He even claims this must be illegal. Well, doh, if there is one thing the revelations from Snowden documented, it is that the entire population of the USA is wiretapped, one way or another. Of course the presidential candidates were wiretapped, alongside the senators, judges and the rest of the people in the USA.

+ +

Next, the Federal Bureau of Investigation asked the Department of Justice to go public rejecting the claims that Donald Trump was wiretapped illegally. I fail to see the relevance, given that I am sure the surveillance industry in the USA believes it has all the legal backing it needs to conduct mass surveillance on the entire world.

+ +

There is even the director of the FBI stating that he never saw an order requesting wiretapping of Donald Trump. That is not very surprising, given how the FISA court works, with all its activity being secret. Perhaps he only heard about it?

+ +

What I find most sad in this story is how Norwegian journalists present it. In a news report on the radio the other day from the Norwegian Broadcasting Corporation (NRK), I heard the journalist claim that 'the FBI denies any wiretapping', while the reality is that 'the FBI denies any illegal wiretapping'. There is a fundamental and important difference, and it makes me sad that the journalists are unable to grasp it.

+ +

Update 2017-03-13: It looks like The Intercept reports that US Senator Rand Paul confirms what I stated above.

@@ -646,46 +625,33 @@ using them, so I am stuck with Signal for now.

- -
6th June 2016
-

When I set out a few weeks ago to figure out which multimedia player in Debian claimed to support the most file formats / MIME types, I was a bit surprised at how varied the sets of MIME types the various players claimed support for were. The range was from 55 to 130 MIME types. I suspect most media formats are supported by all players, but this is not really reflected in the MimeTypes values in their desktop files. There are probably also some bogus MIME types listed, but it is hard to identify which ones they are.
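The comparison can be repeated with a small sketch like the one below, counting the MIME types listed in the desktop file of each installed player. The desktop file names are only examples and depend on which players are installed:

for f in /usr/share/applications/vlc.desktop \
         /usr/share/applications/mpv.desktop; do
  printf '%s: ' "${f##*/}"
  # Count the entries on the MimeType= line, which are separated by ';'.
  sed -n 's/^MimeType=//p' "$f" | tr ';' '\n' | grep -c .
done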

- -

Anyway, in the mean time I got in touch with upstream for some of -the players suggesting to add more MIME types to their desktop files, -and decided to spend some time myself improving the situation for my -favorite media player VLC. The fixes for VLC entered Debian unstable -yesterday. The complete list of MIME types can be seen on the -Multimedia -player MIME type support status Debian wiki page.

- -

The new "best" multimedia player in Debian? It is VLC, followed by totem, parole, kplayer, gnome-mpv, mpv, smplayer, mplayer-gui and kmplayer. I am sure some of the other players' desktop files support several of the formats currently listed as working only with vlc, totem and parole.

- -

A sad observation is that only 14 MIME types are listed as -supported by all the tested multimedia players in Debian in their -desktop files: audio/mpeg, audio/vnd.rn-realaudio, audio/x-mpegurl, -audio/x-ms-wma, audio/x-scpls, audio/x-wav, video/mp4, video/mpeg, -video/quicktime, video/vnd.rn-realvideo, video/x-matroska, -video/x-ms-asf, video/x-ms-wmv and video/x-msvideo. Personally I find -it sad that video/ogg and video/webm is not supported by all the media -players in Debian. As far as I can tell, all of them can handle both -formats.

+ +
3rd March 2017
+

For almost a year now, we have been working on making a Norwegian +Bokmål edition of The Debian +Administrator's Handbook. Now, thanks to the tireless effort of +Ole-Erik, Ingrid and Andreas, the initial translation is complete, and +we are working on the proof reading to ensure consistent language and +use of correct computer science terms. The plan is to make the book +available on paper, as well as in electronic form. For that to +happen, the proof reading must be completed and all the figures need +to be translated. If you want to help out, get in touch.

+ +

A fresh PDF edition of the book in A4 format (the final book will have smaller pages), created every morning, is available for proofreading. If you find any errors, please visit Weblate and correct them. The state of the translation, including figures, is a useful source for those providing Norwegian Bokmål screen shots and figures.

@@ -693,113 +659,72 @@ formats.

- -
5th June 2016
-

Many years ago, when koffice was fresh and with few users, I -decided to test its presentation tool when making the slides for a -talk I was giving for NUUG on Japhar, a free Java virtual machine. I -wrote the first draft of the slides, saved the result and went to bed -the day before I would give the talk. The next day I took a plane to -the location where the meeting should take place, and on the plane I -started up koffice again to polish the talk a bit, only to discover -that kpresenter refused to load its own data file. I cursed a bit and -started making the slides again from memory, to have something to -present when I arrived. I tested that the saved files could be -loaded, and the day seemed to be rescued. I continued to polish the -slides until I suddenly discovered that the saved file could no longer -be loaded into kpresenter. In the end I had to rewrite the slides -three times, condensing the content until the talk became shorter and -shorter. After the talk I was able to pinpoint the problem – -kpresenter wrote inline images in a way itself could not understand. -Eventually that bug was fixed and kpresenter ended up being a great -program to make slides. The point I'm trying to make is that we -expect a program to be able to load its own data files, and it is -embarrassing to its developers if it can't.

- -

Did you ever experience a program failing to load its own data -files from the desktop file browser? It is not a uncommon problem. A -while back I discovered that the screencast recorder -gtk-recordmydesktop would save an Ogg Theora video file the KDE file -browser would refuse to open. No video player claimed to understand -such file. I tracked down the cause being file --mime-type -returning the application/ogg MIME type, which no video player I had -installed listed as a MIME type they would understand. I asked for -file to change its -behavour and use the MIME type video/ogg instead. I also asked -several video players to add video/ogg to their desktop files, to give -the file browser an idea what to do about Ogg Theora files. After a -while, the desktop file browsers in Debian started to handle the -output from gtk-recordmydesktop properly.

- -

But history repeats itself. A few days ago I tested the music -system Rosegarden again, and I discovered that the KDE and xfce file -browsers did not know what to do with the Rosegarden project files -(*.rg). I've reported the -rosegarden problem to BTS and a fix is commited to git and will be -included in the next upload. To increase the chance of me remembering -how to fix the problem next time some program fail to load its files -from the file browser, here are some notes on how to fix it.

- -

The file browsers in Debian in general operates on MIME types. -There are two sources for the MIME type of a given file. The output from -file --mime-type mentioned above, and the content of the -shared MIME type registry (under /usr/share/mime/). The file MIME -type is mapped to programs supporting the MIME type, and this -information is collected from -the -desktop files available in /usr/share/applications/. If there is -one desktop file claiming support for the MIME type of the file, it is -activated when asking to open a given file. If there are more, one -can normally select which one to use by right-clicking on the file and -selecting the wanted one using 'Open with' or similar. In general -this work well. But it depend on each program picking a good MIME -type (preferably -a -MIME type registered with IANA), file and/or the shared MIME -registry recognizing the file and the desktop file to list the MIME -type in its list of supported MIME types.

- -

The /usr/share/mime/packages/rosegarden.xml entry for -the -Shared MIME database look like this:

- -

-<?xml version="1.0" encoding="UTF-8"?>
-<mime-info xmlns="http://www.freedesktop.org/standards/shared-mime-info">
-  <mime-type type="audio/x-rosegarden">
-    <sub-class-of type="application/x-gzip"/>
-    <comment>Rosegarden project file</comment>
-    <glob pattern="*.rg"/>
-  </mime-type>
-</mime-info>
-

- -

This states that audio/x-rosegarden is a kind of application/x-gzip -(it is a gzipped XML file). Note, it is much better to use an -official MIME type registered with IANA than it is to make up ones own -unofficial ones like the x-rosegarden type used by rosegarden.

- -

The desktop file of the rosegarden program failed to list -audio/x-rosegarden in its list of supported MIME types, causing the -file browsers to have no idea what to do with *.rg files:

- -

-% grep Mime /usr/share/applications/rosegarden.desktop
-MimeType=audio/x-rosegarden-composition;audio/x-rosegarden-device;audio/x-rosegarden-project;audio/x-rosegarden-template;audio/midi;
-X-KDE-NativeMimeType=audio/x-rosegarden-composition
+      
+      
1st March 2017
+

A few days ago I ordered a small batch of the ChaosKey, a small USB dongle for generating entropy, created by Bdale Garbee and Keith Packard. Yesterday it arrived, and I am very happy to report that it works great! According to its designers, to get it to work out of the box, you need Linux kernel version 4.1 or later. I tested it on a Debian Stretch machine (kernel version 4.9), and there it worked just fine, increasing the available entropy very quickly. I wrote a small one-liner to test it. It first prints the current entropy level, drains /dev/random, and then prints the entropy level every second for five seconds. Here is the situation without the ChaosKey inserted:

+ +
+% cat /proc/sys/kernel/random/entropy_avail; \
+  dd bs=1M if=/dev/random of=/dev/null count=1; \
+  for n in $(seq 1 5); do \
+     cat /proc/sys/kernel/random/entropy_avail; \
+     sleep 1; \
+  done
+300
+0+1 oppføringer inn
+0+1 oppføringer ut
+28 byte kopiert, 0,000264565 s, 106 kB/s
+4
+8
+12
+17
+21
 %
-

+
+ +

The entropy level increases by 3-4 every second. In such a case, any application requiring random bits (like an HTTPS enabled web server) will halt and wait for more entropy. And here is the situation with the ChaosKey inserted:

+ +
+% cat /proc/sys/kernel/random/entropy_avail; \
+  dd bs=1M if=/dev/random of=/dev/null count=1; \
+  for n in $(seq 1 5); do \
+     cat /proc/sys/kernel/random/entropy_avail; \
+     sleep 1; \
+  done
+1079
+0+1 oppføringer inn
+0+1 oppføringer ut
+104 byte kopiert, 0,000487647 s, 213 kB/s
+433
+1028
+1031
+1035
+1038
+%
+
-

The fix was to add "audio/x-rosegarden;" at the end of the -MimeType= line.

+

Quite the difference. :) I bought a few more than I need, in case someone wants to buy one here in Norway. :)
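If you want to keep an eye on the entropy pool over time while testing such a device, a simple approach is something like this:

watch -n 1 cat /proc/sys/kernel/random/entropy_avail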

-

If you run into a file which fails to open the correct program when selected from the file browser, please check the output from file --mime-type for the file, ensure the file ending and MIME type are registered somewhere under /usr/share/mime/, and check that some desktop file under /usr/share/applications/ claims support for this MIME type. If not, please report a bug to have it fixed. :)
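A quick way to check each of these pieces from the command line is something along these lines, with some-file.rg as a stand-in for the file that refuses to open:

# What MIME type does file(1) report for the document?
file --mime-type some-file.rg
# What does the shared MIME database say?
xdg-mime query filetype some-file.rg
# Which desktop file (if any) is the default handler for that type?
xdg-mime query default audio/x-rosegarden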

+

Update: The dongle was presented at DebConf last year. You might find the talk recording illuminating. It explains exactly what the source of randomness is, if you are unable to spot it from the schematic drawing available from the ChaosKey web site linked at the start of this blog post.