X-Git-Url: http://pere.pagekite.me/gitweb/homepage.git/blobdiff_plain/02ee24e09a5d87b64eeb71a46dc6582d9d9ab8c7..761944476e19ec65dc8459f412f7db75ee2a6d57:/blog/index.html diff --git a/blog/index.html b/blog/index.html index 8c0f30e228..ee6378d702 100644 --- a/blog/index.html +++ b/blog/index.html @@ -20,70 +20,63 @@
-
Unlimited randomness with the ChaosKey?
-
1st March 2017
-

A few days ago I ordered a small batch of the ChaosKey, a small USB dongle for generating entropy created by Bdale Garbee and Keith Packard. Yesterday it arrived, and I am very happy to report that it works great! According to its designers, you need Linux kernel version 4.1 or later for it to work out of the box. I tested it on a Debian Stretch machine (kernel version 4.9), and there it worked just fine, increasing the available entropy very quickly. I wrote a small one-liner to test it. It first prints the current entropy level, drains /dev/random, and then prints the entropy level once per second for five seconds. Here is the situation without the ChaosKey inserted:

- -
-% cat /proc/sys/kernel/random/entropy_avail; \
-  dd bs=1M if=/dev/random of=/dev/null count=1; \
-  for n in $(seq 1 5); do \
-     cat /proc/sys/kernel/random/entropy_avail; \
-     sleep 1; \
-  done
-300
-0+1 oppføringer inn
-0+1 oppføringer ut
-28 byte kopiert, 0,000264565 s, 106 kB/s
-4
-8
-12
-17
-21
-%
-
- -

The entropy level increases by 3-4 every second. In such a situation, any application requiring random bits (like an HTTPS-enabled web server) will halt and wait for more entropy. And here is the situation with the ChaosKey inserted:

- -
-% cat /proc/sys/kernel/random/entropy_avail; \
-  dd bs=1M if=/dev/random of=/dev/null count=1; \
-  for n in $(seq 1 5); do \
-     cat /proc/sys/kernel/random/entropy_avail; \
-     sleep 1; \
-  done
-1079
-0+1 oppføringer inn
-0+1 oppføringer ut
-104 byte kopiert, 0,000487647 s, 213 kB/s
-433
-1028
-1031
-1035
-1038
-%
-
- -

Quite the difference. :) I bought a few more than I need, in case someone wants to buy one here in Norway. :)
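If you want to check that the kernel actually picked up the dongle before running such a test, my understanding is that the chaoskey driver creates a device node and registers with the kernel hwrng framework; a quick look (sysfs paths assumed from the generic hw_random interface, so treat this as a sketch) could be:

% ls /dev/chaoskey*                            # device node created by the chaoskey driver
% cat /sys/class/misc/hw_random/rng_available  # hardware RNGs known to the kernel
% cat /sys/class/misc/hw_random/rng_current    # the RNG currently feeding the kernel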

+ +
21st March 2018
+

So, Cambridge Analytica is getting some well deserved criticism for (mis)using information it got from Facebook about 50 million people, mostly in the USA. What I find a bit surprising is how little criticism Facebook is getting for handing the information over to Cambridge Analytica and others in the first place. And what about the people handing their private and personal information to Facebook? And last, but not least, what about the government offices that are handing information about the visitors to their web pages over to Facebook? No one who has looked at Facebook's terms of use should be surprised that information about people's interests, political views, personal lives and whereabouts would be sold by Facebook.

+ +

What I find to be the real scandal is the fact that Facebook is selling your personal information, not that one of the buyers used it in a way Facebook did not approve of once exposed. It is well known that Facebook is selling out its users' privacy, but it is a scandal nevertheless. Of course the information provided to them by Facebook would be misused by one of the parties given access to personal information about millions of Facebook users. Collected information will be misused sooner or later. The only way to avoid such misuse is to not collect the information in the first place. If you do not want Facebook to hand out information about you for the use and misuse of its customers, do not give Facebook the information.

+ +

Personally, I would recommend completely removing your Facebook account, and taking back some control of your personal information. According to The Guardian, it is a bit hard to find out how to request account removal (and not just 'disabling'). You need to visit a specific Facebook page and click on 'let us know' on that page to get to the real account deletion screen. Perhaps something to consider? I would not trust the information to really be deleted (who knows, perhaps NSA, GCHQ and FRA already got a copy), but it might reduce the exposure a bit.

+ +

If you want to learn more about the capabilities of Cambridge Analytica, I recommend watching the video recording of the one-hour talk Paul-Olivier Dehaye gave to NUUG last April about Data collection, psychometric profiling and their impact on politics.

+ +

And if you want to communicate with your friends and loved ones, use an end-to-end encrypted method like Signal or Ring, and stop sharing your private messages with strangers like Facebook and Google.

- Tags: debian, english. + Tags: english, personvern.
@@ -91,34 +84,67 @@ someone want to buy one here in Norway. :)

- -
21st February 2017
-

I just noticed that the new Norwegian proposal for archiving rules in the government lists ECMA-376 / ISO/IEC 29500 (aka OOXML) as a valid format to put in long term storage. Luckily such files will only be accepted based on pre-approval from the National Archive. Allowing OOXML files to be used for long term storage might seem like a good idea, as long as we forget that there are plenty of ways for a "valid" OOXML document to have content with no defined interpretation in the standard, which leads to a question and an idea.

- -

Is there any tool to detect if an OOXML document depends on such undefined behaviour? It would be useful for the National Archive (and anyone else interested in verifying that a document is well defined) to have such a tool available when considering whether to approve the use of OOXML. I'm aware of the officeotron OOXML validator, but do not know how complete it is, nor whether it will report use of undefined behaviour. Are there other similar tools available? Please send me an email if you know of any.

+ +
14th March 2018
+

Yesterday brought yet another argument for staying away from the Norwegian health system. A parliamentary majority, consisting of Høyre, Arbeiderpartiet, Fremskrittspartiet and Venstre, announced that it wants to collect and store DNA samples from the entire population of Norway for all time. The change applies to blood samples collected from newborns in Norway. It will thus take a while before the whole population is covered, but that is where we end up given enough time. Today there is almost one hundred percent participation in the screening done right after birth, based on the blood sample in question, to detect a number of congenital diseases. The blood sample is currently stored for up to six years. The majority recommendation from the Storting is to remove the time limit, claiming that indefinite storage will not affect participation in the screening.

+ +

Datatilsynet (the Norwegian Data Protection Authority) has not exactly applauded the proposal:

+ +

+ +

«Datatilsynet believes the proposal does not sufficiently make visible the ethical and privacy related challenges that must be discussed before establishing a national biobank with blood samples from the entire population.»

+ +

+ +

There are several stories of collected biological material being used for purposes other than those it was collected for, and the story of how Folkehelseinstituttet (the Norwegian Institute of Public Health) stored collected biological material and DNA information on behalf of the police (Kripos) in violation of the law shows that one cannot trust laws and intentions to protect those affected against misuse of such private and personal information.

+ +

It is worth noting that, after a change in the law a while back, research can be done on the collected blood samples without the consent of the person concerned (or the parents in the case of children), unless a form has been submitted reserving oneself against research without consent. The form is available from the web pages of Folkehelseinstituttet, and independently of this case I warmly recommend everyone to submit the form, to document how many people do not find it acceptable to drop the requirement for consent.

+ +

In addition, one should demand destruction of all biological material collected about oneself, to reduce possible negative consequences in the future when the material goes astray or is used without consent, but as far as I know there is no system for this today.

+ +

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

@@ -126,33 +152,62 @@ available? Please send me an email if you know of any such tool.

- -
13th February 2017
-

A few days ago, we received the ruling from -my -day in court. The case in question is a challenge of the seizure -of the DNS domain popcorn-time.no. The ruling simply did not mention -most of our arguments, and seemed to take everything ØKOKRIM said at -face value, ignoring our demonstration and explanations. But it is -hard to tell for sure, as we still have not seen most of the documents -in the case and thus were unprepared and unable to contradict several -of the claims made in court by the opposition. We are considering an -appeal, but it is partly a question of funding, as it is costing us -quite a bit to pay for our lawyer. If you want to help, please -donate to the -NUUG defense fund.

- -

The details of the case, as far as we know them, are available in Norwegian from the NUUG blog. This also includes the ruling itself.

+ +
13th March 2018
+

I am working on publishing yet another book related to Creative Commons. This time it is a book filled with interviews and stories from people around the globe making a living using Creative Commons.

+ +

Yesterday, after many months of hard work by several volunteer translators, the first draft of a Norwegian Bokmål edition of the book Made with Creative Commons from 2017 was completed. The Spanish translation is also complete, while the Dutch, Polish, German and Ukrainian editions need a lot of work. Get in touch if you want to help make those happen, or would like to translate the book into your mother tongue.

+ +

The whole book project started when Gunnar Wolf announced that he was going to make a Spanish edition of the book. I noticed, and offered some input on how to make a book, based on my experience with translating the Free Culture and The Debian Administrator's Handbook books into Norwegian Bokmål. To make a long story short, we ended up working on a Bokmål edition, and now the first rough translation is complete, thanks to the hard work of Ole-Erik Yrvin, Ingrid Yrvin, Allan Nordhøy and myself. The first proofreading is almost done, and only the second and third proofreading passes remain. We will also need to translate the 14 figures and create a book cover. Once that is done, we will publish the book on paper, as well as in PDF, ePub and possibly Mobi formats.

+ +

The book itself originates as a manuscript on Google Docs, is downloaded as ODT from there and converted to Markdown using pandoc. The Markdown is modified by a script before it is converted to DocBook using pandoc. The DocBook is modified again using a script before it is used to create a Gettext POT file for the translators. The translated PO file is then combined with the DocBook file mentioned earlier to create a translated DocBook file, which finally is given to dblatex to create the final PDF. The end result is a set of editions of the manuscript, one English and one for each of the translations.
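To make the pipeline easier to follow, here is a minimal shell sketch of the steps described above. The pandoc and dblatex calls match the tools named in the text; the fixup scripts and the po4a-based gettext steps are stand-ins for the project's own helper scripts, so file names and options are assumptions rather than the actual build setup:

#!/bin/sh
set -e
# ODT exported from Google Docs -> Markdown
pandoc -f odt -t markdown book.odt -o book.md
./fixup-markdown book.md               # hypothetical fixup script
# Markdown -> DocBook
pandoc -f markdown -t docbook book.md -o book.xml
./fixup-docbook book.xml               # hypothetical fixup script
# DocBook -> POT, then merge the translated PO back into DocBook
po4a-gettextize -f docbook -m book.xml -p book.pot
po4a-translate  -f docbook -m book.xml -p nb.po -l book.nb.xml -k 0
# Translated DocBook -> PDF
dblatex book.nb.xml -o book.nb.pdf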

+ +

The translation is conducted using the Weblate web based translation system. Please have a look there and get in touch if you would like to help out with the proofreading. :)

+ +

As usual, if you use Bitcoin and want to show your support of my +activities, please send Bitcoin donations to my address +15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

@@ -160,86 +215,54 @@ ruling itself.

- -
3rd February 2017
-

- -

On Wednesday, I spent the entire day in court in Follo Tingrett -representing the member association -NUUG, alongside the member -association EFN and the DNS registrar -IMC, challenging the seizure of the DNS name popcorn-time.no. It -was interesting to sit in a court of law for the first time in my -life. Our team can be seen in the picture above: attorney Ola -Tellesbø, EFN board member Tom Fredrik Blenning, IMC CEO Morten Emil -Eriksen and NUUG board member Petter Reinholdtsen.

- -

The case at hand is that the Norwegian National Authority for Investigation and Prosecution of Economic and Environmental Crime (aka Økokrim) decided on its own to seize a DNS domain early last year, without following the official policy of the Norwegian DNS authority, which requires a court decision. The web site in question was a site covering Popcorn Time. And Popcorn Time is the name of a technology with both legal and illegal applications. Popcorn Time is a client combining searching a Bittorrent directory available on the Internet with downloading/distributing content via Bittorrent and playing the downloaded content on screen. It can be used illegally if it is used to distribute content against the will of the rights holder, but it can also be used legally to play a lot of content, for example the millions of movies available from the Internet Archive or the collection available from Vodo. We created a video demonstrating legal use of Popcorn Time and played it in court. It can of course be downloaded using Bittorrent.

- -

I did not quite know what to expect from a day in court. The government held on to their version of the story and we held on to ours, and I hope the judge is able to make sense of it all. We will know in two weeks' time. Unfortunately I do not have high hopes, as the government has the upper hand here, with more knowledge about the case, better training in handling criminal law and in general higher standing in the courts than a fairly unknown DNS registrar and member associations. It is expensive to be right, also in Norway. So far the case has cost more than NOK 70 000,-. To help fund the case, NUUG and EFN have asked for donations, and managed to collect around NOK 25 000,- so far. Given the presentation from the government, I expect the government to appeal if the case goes our way. And if the case does not go our way, I hope we have enough funding to appeal.

- -

From the other side came two people from Økokrim. On the benches, appearing to be part of the group from the government, were two people from the Simonsen Vogt Wiik law firm, and three others I am not quite sure who were. Økokrim had proposed to present two witnesses from The Motion Picture Association, but this was rejected because they did not speak Norwegian and it was a bit late to bring in a translator, but perhaps the two from MPA were present anyway. All seven appeared to know each other. Good to see the case is taken seriously.

- -

If you, like me, believe the courts should be involved before a DNS domain is hijacked by the government, or you believe the Popcorn Time technology has a lot of useful and legal applications, I suggest you too donate to the NUUG defense fund. Both Bitcoin and bank transfer are available. If NUUG gets more than we need for the legal action (very unlikely), the rest will be spent promoting free software, open standards and unix-like operating systems in Norway, so no matter what happens the money will be put to good use.

- -

If you want to learn more about the case, I recommend you check out the blog posts from NUUG covering it. They cover the legal arguments on both sides.

+ +
2nd March 2018
+

Today I was pleasantly surprised to discover that my operating system of choice, Debian, is used on the information screens at subway stations. While passing Nydalen subway station in Oslo, Norway, I discovered an info screen booting with some text scrolling by. I was not quick enough with my camera to record a video of the scrolling boot screen, but I did get a photo from when the boot got stuck with a corrupt file system:

[photo of subway info screen]

+ +

While I am happy to see Debian used in more places, some details of the content on the screen worry me.

+ +

The image shows that the version booting is 'Debian GNU/Linux lenny/sid', indicating that this is based on code taken from Debian Unstable/Sid after Debian Etch (version 4) was released 2007-04-08 and before Debian Lenny (version 5) was released 2009-02-14. Since Lenny, Debian has released version 6 (Squeeze) 2011-02-06, 7 (Wheezy) 2013-05-04, 8 (Jessie) 2015-04-25 and 9 (Stretch) 2017-06-15, according to a Debian version history on Wikipedia. This means the system is running around 10 year old code, with no security fixes from the vendor for many years.
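For reference, that 'lenny/sid' string is the same form Debian uses in /etc/debian_version on a testing/unstable system of that era, so a quick way to date such a box from a shell (assuming you can get one on it) is simply:

% cat /etc/debian_version   # e.g. "lenny/sid" on a pre-Lenny testing/unstable system
% lsb_release -a            # fuller description, if the lsb-release package is installed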

+ +

This is not the first time I have discovered the Oslo subway company, Ruter, running outdated software. In 2012, I discovered that the ticket vending machines were running Windows 2000, and this was still the case in 2016. Given the response from the responsible people in 2016, I would assume the machines are still running unpatched Windows 2000. Thus, an unpatched Debian setup comes as no surprise.

+ +

The photo is made available under the license terms +Creative Commons +4.0 Attribution International (CC BY 4.0).

+ +

As usual, if you use Bitcoin and want to show your support of my +activities, please send Bitcoin donations to my address +15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

@@ -247,41 +270,39 @@ on both sides.

- -
12th January 2017
-

Today I got some really good news. The background is that before Christmas, the National Library of Norway (Nasjonalbiblioteket) arranged a seminar about its excellent «verksregister» (work registry) initiative. The only way to sign up for the seminar was to send personal information to Google via Google Forms. I found this a questionable practice, as it should be possible to attend seminars arranged by the public sector without having to share one's interests, position and other personal information with Google. I therefore requested access, via Mimes brønn, to the agreements and assessments Nasjonalbiblioteket had made around this. The Personal Data Act sets clear limits on what must be in place before third parties, especially abroad, can be asked to process personal data on one's behalf, so thorough documentation should exist before something like this can be legal. Two lawyers at Nasjonalbiblioteket initially believed this was perfectly fine, and that Google's standard agreement could be used as a data processing agreement. I found that strange, but did not have the capacity to follow up on the case until two days ago.

- -

The good news today, which came after I tipped off Nasjonalbiblioteket that Datatilsynet rejected Google's standard agreements as data processing agreements back in 2011, is that Nasjonalbiblioteket has decided to stop using Google Forms/Apps and to enter a dialogue with DIFI to find better ways to handle sign-ups in line with the Personal Data Act. It is fantastic to see that it sometimes helps to ask what on earth the public sector is up to.

+ +
18th February 2018
+

Surprising as it might sound, there are still computers using the traditional Sys V init system, and there probably will be until systemd starts working on Hurd and FreeBSD. The upstream project still exists, though, and up until today the upstream source was available from Savannah via Subversion. I am happy to report that this just changed.

+ +

The upstream source is now in Git, and consists of three repositories:

+ + + +
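As an illustration, cloning the main repository from Savannah should look something like this; the exact URL is my assumption based on the usual Savannah git layout, so check the repository list above for the authoritative links:

git clone https://git.savannah.nongnu.org/git/sysvinit.git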

I do not really spend much time on the project these days, and I have mostly retired from it, but I found it best to migrate the source to a proper version control system to help those willing to move it forward.

+ +

As usual, if you use Bitcoin and want to show your support of my +activities, please send Bitcoin donations to my address +15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

@@ -289,86 +310,79 @@ det å spørre hva i alle dager det offentlige holder på med.

- -
11th January 2017
-

I read with interest a news story at digi.no and NRK that it is not just me: NAV (the Norwegian Labour and Welfare Administration) also geolocates IP addresses, analysing the IP addresses of those submitting unemployment report cards to check whether the cards are submitted from foreign IP addresses. Police attorney Hans Lyder Haare in Drammen is quoted by NRK saying «The two were, among other things, exposed by IP addresses. One can see that the report card is submitted from abroad.»

- -

I think it is good that it becomes better known that IP addresses are tied to individuals and that collected information is used to determine people's location, also by actors here in Norway. I see it as yet another argument for using Tor as much as possible, to make IP geolocation harder, so that one can protect one's privacy and avoid sharing one's physical location with strangers.

- -

But there is one thing about this news that worries me. I was tipped off (thanks #nuug) about NAV's privacy statement, which under the heading «Privacy and statistics» reads:

- -

- -

«When you visit nav.no, you leave electronic traces behind. The traces are created because your browser automatically sends a series of details to NAV's server machine every time you request a page. These are, for example, details about which browser and version you use, and your Internet address (IP address). For every page shown, the following details are stored:

- -
    -
  • which page you are viewing
  • date and time
  • which browser you are using
  • your IP address
- -

None of the details will be used to identify individuals. NAV uses these details to generate aggregate statistics showing, among other things, which pages are most popular. The statistics are a tool for improving our services.»

+ +
14th February 2018
+

A few days ago, a new major version of VLC was announced, and I decided to check whether it now supports streaming over bittorrent and webtorrent. Bittorrent is one of the most efficient ways to distribute large files on the Internet, and Webtorrent is a variant of Bittorrent using WebRTC as its transport channel, allowing web pages to stream and share files using the same technique. The network protocols are similar but not identical, so a client supporting one of them can not talk to a client supporting the other. I was a bit surprised by what I discovered when I started to look. Looking at the release notes did not help answer this question, so I started searching the web. I found several news articles from 2013, most of them tracing the news from Torrentfreak ("Open Source Giant VLC Mulls BitTorrent Streaming Support"), about an initiative to pay someone to create a VLC patch for bittorrent support. To figure out what happened with this initiative, I headed over to the #videolan IRC channel and asked if there were any bug or feature request tickets tracking such a feature. I got an answer from lead developer Jean-Baptiste Kempf, telling me that there was a patch but neither he nor anyone else knew where it was. So I searched a bit more, and came across an independent VLC plugin to add bittorrent support, created by Johan Gunnarsson in 2016/2017. Again according to Jean-Baptiste, this is not the patch he was talking about.

+ +

Anyway, to test the plugin, I made a working Debian package from +the git repository, with some modifications. After installing this +package, I could stream videos from +The Internet Archive using VLC +commands like this:

+ +

+vlc https://archive.org/download/LoveNest/LoveNest_archive.torrent
+

+ +
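For those curious about the packaging step mentioned above: I will not claim this is exactly how my package was built, but a rough sketch of building a Debian package from the plugin's git repository (the repository URL and the assumption that the tree ships usable debian/ packaging are mine) could look like:

git clone https://github.com/johang/vlc-bittorrent.git
cd vlc-bittorrent
sudo apt build-dep .            # assumes debian/control is present in the tree
dpkg-buildpackage -us -uc -b
sudo dpkg -i ../vlc-plugin-bittorrent_*.deb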

The plugin is supposed to handle magnet links too, but since The Internet Archive does not provide magnet links and I did not want to spend time tracking down another source, I have not tested it. It can take quite a while before the video starts playing, without any indication from VLC of what is going on. It took 10-20 seconds when I measured it. Sometimes the plugin seems unable to find the correct video file to play, and shows the metadata XML file name in the VLC status line. I have no idea why.

+ +

I have created a request for a new package in Debian (RFP) and asked whether the upstream author is willing to help make this happen. Now we wait to see what comes out of this. I do not want to maintain a package that is not maintained upstream, nor do I really have time to maintain more packages myself, so I might leave it at this. But I really hope someone steps up to do the packaging, and I hope upstream is still maintaining the source. If you want to help, please update the RFP request or the upstream issue.

+ +

I have not found any traces of webtorrent support for VLC.

-

- -

I fail to see how analysing the IP addresses of visitors, to check who submits report cards via the web from an IP address abroad, can be done without conflicting with the claim that «none of the details will be used to identify individuals». It thus looks to me like NAV is breaking its own privacy statement, which Datatilsynet told me at the start of December is probably a violation of the Personal Data Act.

In addition, the privacy statement is quite misleading, since NAV's web pages not only provide NAV with personal data, but also ask the user's browser to contact five other servers (script.hotjar.com, static.hotjar.com, vars.hotjar.com, www.google-analytics.com and www.googletagmanager.com), making personal data available to the companies Hotjar and Google, and to anyone able to listen to the traffic along the way (such as FRA, GCHQ and NSA). Nor can I see how such spreading of personal data can be in line with the requirements of the Personal Data Act, or with NAV's privacy statement.

- -

Perhaps NAV should take a careful look at its privacy statement? Or perhaps Datatilsynet should?

+

As usual, if you use Bitcoin and want to show your support of my +activities, please send Bitcoin donations to my address +15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

@@ -376,164 +390,32 @@ kanskje Datatilsynet bør gjøre det?

- -
9th January 2017
-

Did you ever wonder where the web traffic really flows to reach the web servers, and who owns the network equipment it is flowing through? It is possible to get a glimpse of this using traceroute, but it is hard to find all the details. Many years ago, I wrote a system to map the Norwegian Internet (trying to figure out if our plans for a network game service would get low enough latency, and who we needed to talk to about setting up game servers close to the users). Back then I used traceroute output from many locations (I asked my friends to run a script and send me their traceroute output) to create the graph and the map. The output from traceroute typically looks like this:

-traceroute to www.stortinget.no (85.88.67.10), 30 hops max, 60 byte packets
- 1  uio-gw10.uio.no (129.240.202.1)  0.447 ms  0.486 ms  0.621 ms
- 2  uio-gw8.uio.no (129.240.24.229)  0.467 ms  0.578 ms  0.675 ms
- 3  oslo-gw1.uninett.no (128.39.65.17)  0.385 ms  0.373 ms  0.358 ms
- 4  te3-1-2.br1.fn3.as2116.net (193.156.90.3)  1.174 ms  1.172 ms  1.153 ms
- 5  he16-1-1.cr1.san110.as2116.net (195.0.244.234)  2.627 ms he16-1-1.cr2.oslosda310.as2116.net (195.0.244.48)  3.172 ms he16-1-1.cr1.san110.as2116.net (195.0.244.234)  2.857 ms
- 6  ae1.ar8.oslosda310.as2116.net (195.0.242.39)  0.662 ms  0.637 ms ae0.ar8.oslosda310.as2116.net (195.0.242.23)  0.622 ms
- 7  89.191.10.146 (89.191.10.146)  0.931 ms  0.917 ms  0.955 ms
- 8  * * *
- 9  * * *
-[...]
-

- -

This shows the DNS names and IP addresses of (at least some of) the network equipment involved in getting the data traffic from me to the www.stortinget.no server, and how long it took in milliseconds for a packet to reach the equipment and return to me. Three packets are sent, and sometimes the packets do not follow the same path. This is shown for hop 5, where three different IP addresses replied to the traceroute request.

- -

There are many ways to measure trace routes. Other good traceroute implementations I use are traceroute (using ICMP packets), mtr (can do ICMP, UDP and TCP) and scapy (a Python library with ICMP, UDP and TCP traceroute and a lot of other capabilities). All of them are easily available in Debian.
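As an example of the alternatives, a TCP based trace in report mode with mtr (options taken from the mtr manual page; host picked to match the example above) can be run like this:

mtr --report --tcp --port 443 www.stortinget.no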

- -

This time around, I wanted to know the geographic location of the different route points, to visualise how visiting a web page spreads information about the visit to a lot of servers around the globe. The background is that a web site today will often ask the browser to fetch the parts required to display the content (for example HTML, JSON, fonts, JavaScript, CSS, video) from many servers. This will leak information about the visit to those controlling these servers and to anyone able to peek at the data traffic passing by (like your ISP, the ISP's backbone provider, FRA, GCHQ, NSA and others).

- -

Let's pick an example, the Norwegian parliament web site www.stortinget.no. It is read daily by all members of parliament and their staff, as well as political journalists, activists and many other citizens of Norway. A visit to the www.stortinget.no web site will ask your browser to contact 8 other servers: ajax.googleapis.com, insights.hotjar.com, script.hotjar.com, static.hotjar.com, stats.g.doubleclick.net, www.google-analytics.com, www.googletagmanager.com and www.netigate.se. I extracted this by asking PhantomJS to visit the Stortinget web page and tell me all the URLs PhantomJS downloaded to render the page (in HAR format using their netsniff example; I am very grateful to Gorm for showing me how to do this). My goal is to visualise network traces to all IP addresses behind these DNS names, to show where visitors' personal information is spread when visiting the page.
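The HAR extraction step can be reproduced with the netsniff.js example shipped with PhantomJS; the exact path to the example script will vary with how PhantomJS is installed, so treat this invocation as a sketch:

phantomjs examples/netsniff.js https://www.stortinget.no/ > stortinget.har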

- -

map of combined traces for URLs used by www.stortinget.no using GeoIP

- -

When I had a look around for options, I could not find any good free software tools to do this, and decided I needed my own traceroute wrapper outputting KML based on locations looked up using GeoIP. KML is easy to work with and easy to generate, and understood by several of the GIS tools I have available. I got good help from my NUUG colleague Anders Einar with this, and the result can be seen in my kmltraceroute git repository. Unfortunately, the quality of the free GeoIP databases I could find (and the for-pay databases my friends had access to) is not up to the task. The IP addresses of central Internet infrastructure would typically be placed near the controlling company's main office, and not where the router is really located, as you can see from the KML file I created using the GeoLite City dataset from MaxMind.

scapy traceroute graph for URLs used by www.stortinget.no

- -

I also had a look at the visual traceroute graph created by the scapy project, showing IP network ownership (aka AS owner) for the IP addresses in question. The graph displays a lot of useful information about the traceroute in SVG format, and gives a good indication of who controls the network equipment involved, but it does not include geolocation. The graph makes it possible to see that the information is made available to at least UNINETT, Catchcom, Stortinget, Nordunet, Google, Amazon, Telia, Level 3 Communications and NetDNA.

- -

example geotraceroute view for www.stortinget.no

- -

In the process, I came across the web service GeoTraceroute by Salim Gasmi. Its methodology of combining guesses based on DNS names and various location databases, and finally using latency times to rule out candidate locations, seemed to do a very good job of guessing the correct geolocation. But it could only do one trace at a time, did not have a sensor in Norway, and did not make the geolocations easily available for postprocessing. So I contacted the developer and asked if he would be willing to share the code (he declined until he has had time to clean it up), but he was interested in providing the geolocations in a machine readable format, and willing to set up a sensor in Norway. So since yesterday, it is possible to run traces from Norway in this service thanks to a sensor node set up by the NUUG association, and get the trace in KML format for further processing.

- -

map of combined traces for URLs used by www.stortinget.no using geotraceroute

- -

Here we can see that a lot of traffic passes through Sweden on its way to Denmark, Germany, the Netherlands and Ireland. Plenty of places where, as the Snowden revelations confirmed, the traffic is read by various actors that do not have your best interest as their top priority.

- -

Combining KML files is trivial using a text editor, so I could loop over all the hosts behind the URLs imported by www.stortinget.no, ask for the KML file from GeoTraceroute, and create a combined KML file with all the traces (unfortunately only one of the IP addresses behind each DNS name is traced this time; to get them all, one would have to request traces from GeoTraceroute using IP numbers instead of DNS names). That might be the next step in this project.

- -

Armed with these tools, I find it a lot easier to figure out where the IP traffic moves and who controls the boxes involved in moving it. And every time the link crosses for example the Swedish border, we can be sure Swedish Signals Intelligence (FRA) is listening, as GCHQ does in Britain and the NSA in the USA and on cables around the globe. (Hm, what should we tell them? :) Keep that in mind if you ever send anything unencrypted over the Internet.

- -

PS: The KML files are drawn using the KML viewer from Ivan Rublev, as it was less cluttered than the local Linux application Marble. There are heaps of other options too.

+ +
13th February 2018
+

A new version of the 3D printer slicer software Cura, version 3.1.0, is now available in Debian Testing (aka Buster) and Debian Unstable (aka Sid). I hope you find it useful. It was uploaded over the last few days, and the last update will enter testing tomorrow. See the release notes for the list of bug fixes and new features. Version 3.2 was announced 6 days ago. We will try to get it into Debian as well.
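If you are on Testing or Unstable, installing it should then be as simple as this (the binary package in Debian is simply called cura, as far as I can tell):

sudo apt install cura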

+ +

More information related to 3D printing is available on the +3D printing and +3D printer wiki pages +in Debian.

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address -15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

+15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

@@ -541,77 +423,116 @@ activities, please send Bitcoin donations to my address
- -
4th January 2017
-

Do you have a large iCalendar file with lots of old entries, and would like to archive them to save space and resources? At least those of us using KOrganizer know that turning an event set on and off becomes slower and slower the more entries are in the set. While working on migrating our calendars to a Radicale CalDAV server on our Freedombox server, my loved one wondered if I could find a way to split up the calendar file she had in KOrganizer, and I set out to write a tool. I spent a few days writing and polishing the system, and it is now ready for general consumption. The code for ical-archiver is publicly available from a git repository on github. The system is written in Python and depends on the vobject Python module.

- -

To use it, locate the iCalendar file you want to operate on and -give it as an argument to the ical-archiver script. This will -generate a set of new files, one file per component type per year for -all components expiring more than two years in the past. The vevent, -vtodo and vjournal entries are handled by the script. The remaining -entries are stored in a 'remaining' file.

- -

This is what a test run can look like: +

+
12th February 2018
+

I am fascinated by an article in Dagbladet about China's handling of Xinjiang, in particular the following excerpt:

-

-% ical-archiver t/2004-2016.ics 
-Found 3612 vevents
-Found 6 vtodos
-Found 2 vjournals
-Writing t/2004-2016.ics-subset-vevent-2004.ics
-Writing t/2004-2016.ics-subset-vevent-2005.ics
-Writing t/2004-2016.ics-subset-vevent-2006.ics
-Writing t/2004-2016.ics-subset-vevent-2007.ics
-Writing t/2004-2016.ics-subset-vevent-2008.ics
-Writing t/2004-2016.ics-subset-vevent-2009.ics
-Writing t/2004-2016.ics-subset-vevent-2010.ics
-Writing t/2004-2016.ics-subset-vevent-2011.ics
-Writing t/2004-2016.ics-subset-vevent-2012.ics
-Writing t/2004-2016.ics-subset-vevent-2013.ics
-Writing t/2004-2016.ics-subset-vevent-2014.ics
-Writing t/2004-2016.ics-subset-vjournal-2007.ics
-Writing t/2004-2016.ics-subset-vjournal-2011.ics
-Writing t/2004-2016.ics-subset-vtodo-2012.ics
-Writing t/2004-2016.ics-remaining.ics
-%
-

+

-

As you can see, the original file is untouched and new files are written with names derived from the original file. If you are happy with their content, the *-remaining.ics file can replace the original, and the others can be archived or imported as historical calendar collections.

+

«In the south-western city of Kashgar, closer to the border with Central Asia, it is now reported that 120,000 Uighurs are interned in so-called re-education camps. At the same time, an extensive health check programme has been introduced, with collection and storage of DNA samples from absolutely all inhabitants. The most advanced surveillance methods are being tested out here. Programs for recognising faces and voices are in place in the region. There, the local authorities have started installing GPS systems in all vehicles and dedicated tracking apps on mobile phones.

-

The script should probably be improved a bit. The error handling when discovering broken entries is not good, and I am not sure yet whether it makes sense to split different entry types into separate files or not. The program is thus likely to change. If you find it interesting, please get in touch. :)

+

The police methods intrude so deeply into people's daily lives that resistance to the Beijing regime is growing.»

-

As usual, if you use Bitcoin and want to show your support of my -activities, please send Bitcoin donations to my address -15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

+

+ +

Unfortunately, the description does not differ all that much from the state of affairs here in Norway.

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Data registration | China | Norway
Collection and storage of DNA samples from the population | Yes | Partially; planned for all newborns
Facial recognition | Yes | Yes
Voice recognition | Yes | No
Position tracking of mobile phones | Yes | Yes
Position tracking of cars | Yes | Yes
+ +

In Norway, the situation around Folkehelseinstituttet's storage of DNA information on behalf of the police, where it refused to delete information the police were not allowed to keep, has made it clear that DNA is kept for quite a long time. In addition, there are countless biobanks stored indefinitely, and there are plans to introduce indefinite storage of DNA material from all babies born (with the option of requesting deletion).

+ +

In Norway, systems for facial recognition are in place; an NRK article from 2015 tells us it is active at Gardermoen, and that it is used to analyse images collected by the authorities. Is it used in more places? Oslo city centre, for example, is dense with surveillance cameras controlled by the police and other authorities.

+ +

I am not aware of Norway having any system for identifying people by means of voice recognition.

+ +

Position tracking of mobile phones is routinely available to, among others, the police, NAV and the Financial Supervisory Authority, in line with the requirements in the telephone companies' licences. In addition, smartphones report their position to the developers of countless mobile apps, from which authorities and others can extract information when needed. There is no need for a dedicated app for this.

+ +

Position tracking of cars is routinely available via a dense net of measuring points along the roads (automatic toll stations, toll tag registration, automatic speed cameras and other road cameras). In addition, it has been decided that all new cars must be sold with equipment for GPS tracking (eCall).

+ +

It sure is good that we live in a liberal democracy, and not in a surveillance state. Or do we?

+ +

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

- Tags: english, standard. + Tags: norsk, surveillance.
@@ -619,96 +540,31 @@ activities, please send Bitcoin donations to my address
- -
23rd December 2016
-

I received a very nice Christmas present today. As my regular readers probably know, I have been working on the Isenkram system for many years. The goal of the Isenkram system is to make it easier for users to figure out what to install to get a given piece of hardware to work in Debian, and a key part of this system is a way to map hardware to packages. Isenkram has its own mapping database, and also uses data provided by each package using the AppStream metadata format. And today, AppStream in Debian learned to look up hardware the same way Isenkram is doing it, i.e. using fnmatch():

- -

-% appstreamcli what-provides modalias \
-  usb:v1130p0202d0100dc00dsc00dp00ic03isc00ip00in00
-Identifier: pymissile [generic]
-Name: pymissile
-Summary: Control original Striker USB Missile Launcher
-Package: pymissile
-% appstreamcli what-provides modalias usb:v0694p0002d0000
-Identifier: libnxt [generic]
-Name: libnxt
-Summary: utility library for talking to the LEGO Mindstorms NXT brick
-Package: libnxt
----
-Identifier: t2n [generic]
-Name: t2n
-Summary: Simple command-line tool for Lego NXT
-Package: t2n
----
-Identifier: python-nxt [generic]
-Name: python-nxt
-Summary: Python driver/interface/wrapper for the Lego Mindstorms NXT robot
-Package: python-nxt
----
-Identifier: nbc [generic]
-Name: nbc
-Summary: C compiler for LEGO Mindstorms NXT bricks
-Package: nbc
-%
-

+ +
11th February 2018
+
-

A similar query can be done using the combined AppStream and -Isenkram databases using the isenkram-lookup tool:

+

We write 2018, and it is 30 years since Unicode was introduced. Most of us in Norway have come to expect the use of our alphabet to just work with any computer system. But it is apparently beyond the reach of the computers printing receipts at a restaurant. Recently I visited a Peppes Pizza restaurant, and noticed a few details on the receipt. Notice how 'ø' and 'å' are replaced with strange symbols in 'Servitør', 'Å BETALE', 'Beløp pr. gjest', 'Takk for besøket.' and 'Vi gleder oss til å se deg igjen'.
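My guess, and it is only a guess, is a classic code page mismatch: the receipt text is sent to the printer as Latin-1 or UTF-8, while the printer renders it using an old IBM style code page. The effect is easy to reproduce in a shell with iconv:

# 'ø' (0xF8 in Latin-1) comes out as '°' and 'å' (0xE5) as 'σ' when the
# bytes are interpreted as code page 437 -- one plausible way to end up
# with a mangled 'Servitør' on a receipt.
echo 'Servitør - Takk for besøket' | iconv -f UTF-8 -t LATIN1 | iconv -f CP437 -t UTF-8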

-

-% isenkram-lookup usb:v1130p0202d0100dc00dsc00dp00ic03isc00ip00in00
-pymissile
-% isenkram-lookup usb:v0694p0002d0000
-libnxt
-nbc
-python-nxt
-t2n
-%
-

+

I would say that this state of affairs is past sad and well into embarrassing.

-

You can find modalias values relevant for your machine using -cat $(find /sys/devices/ -name modalias). - -

If you want to make this system a success and help Debian users make the most of the hardware they have, please help add AppStream metadata for your package following the guidelines documented in the wiki. So far only 11 packages provide such information, among the several hundred hardware specific packages in Debian. The Isenkram database, on the other hand, contains 101 packages, mostly related to USB dongles. Most of the packages with hardware mapping in AppStream are LEGO Mindstorms related, because I have, as part of my involvement in the Debian LEGO team, given priority to making sure LEGO users get proposed the complete set of packages in Debian for that particular hardware. The team also got a nice Christmas present today. The nxt-firmware package made it into Debian. With this package in place, it is now possible to use the LEGO Mindstorms NXT unit with only free software, as the nxt-firmware package contains the source and firmware binaries for the NXT brick.

+

I removed personal and private information to be nice.

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address -15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

+15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

- Tags: debian, english, isenkram. + Tags: english.
@@ -716,106 +572,81 @@ activities, please send Bitcoin donations to my address
- -
20th December 2016
-

The Isenkram system I wrote two years ago to make it easier in Debian to find and install packages to get your hardware dongles to work is still going strong. It is a system that looks up the hardware present on or connected to the current system, and maps the hardware to Debian packages. It can either be done using the tools in isenkram-cli or using the user space daemon in the isenkram package. The latter will notify you, when inserting new hardware, about what packages to install to get the dongle working. It will even provide a button to click on to ask packagekit to install the packages.

- -

Here is a command line example from my Thinkpad laptop:

+ +
7th January 2018
+

I've continued to track down lists of movies that are legal to distribute on the Internet, and have identified more than 11,000 title IDs in The Internet Movie Database (IMDB) so far. Most of them (57%) are feature films from the USA published before 1923. I've also tracked down more than 24,000 movies I have not yet been able to map to an IMDB title ID, so the real number could be a lot higher. According to the front web page of Retro Film Vault, there are 44,000 public domain films, so I guess there are still some left to identify.

+ +

The complete data set is available from a public git repository, including the scripts used to create it. Most of the data is collected using web scraping, for example from the "product catalog" of companies selling copies of public domain movies, but any source I find believable is used. I've so far had to throw out three sources because I did not trust the public domain status of the movies they listed.

+ +

Anyway, this is the summary of the 28 collected data sources so +far:

-% isenkram-lookup  
-bluez
-cheese
-ethtool
-fprintd
-fprintd-demo
-gkrellm-thinkbat
-hdapsd
-libpam-fprintd
-pidgin-blinklight
-thinkfan
-tlp
-tp-smapi-dkms
-tp-smapi-source
-tpb
-%
+ 2352 entries (   66 unique) with and 15983 without IMDB title ID in free-movies-archive-org-search.json
+ 2302 entries (  120 unique) with and     0 without IMDB title ID in free-movies-archive-org-wikidata.json
+  195 entries (   63 unique) with and   200 without IMDB title ID in free-movies-cinemovies.json
+   89 entries (   52 unique) with and    38 without IMDB title ID in free-movies-creative-commons.json
+  344 entries (   28 unique) with and   655 without IMDB title ID in free-movies-fesfilm.json
+  668 entries (  209 unique) with and  1064 without IMDB title ID in free-movies-filmchest-com.json
+  830 entries (   21 unique) with and     0 without IMDB title ID in free-movies-icheckmovies-archive-mochard.json
+   19 entries (   19 unique) with and     0 without IMDB title ID in free-movies-imdb-c-expired-gb.json
+ 6822 entries ( 6669 unique) with and     0 without IMDB title ID in free-movies-imdb-c-expired-us.json
+  137 entries (    0 unique) with and     0 without IMDB title ID in free-movies-imdb-externlist.json
+ 1205 entries (   57 unique) with and     0 without IMDB title ID in free-movies-imdb-pd.json
+   84 entries (   20 unique) with and   167 without IMDB title ID in free-movies-infodigi-pd.json
+  158 entries (  135 unique) with and     0 without IMDB title ID in free-movies-letterboxd-looney-tunes.json
+  113 entries (    4 unique) with and     0 without IMDB title ID in free-movies-letterboxd-pd.json
+  182 entries (  100 unique) with and     0 without IMDB title ID in free-movies-letterboxd-silent.json
+  229 entries (   87 unique) with and     1 without IMDB title ID in free-movies-manual.json
+   44 entries (    2 unique) with and    64 without IMDB title ID in free-movies-openflix.json
+  291 entries (   33 unique) with and   474 without IMDB title ID in free-movies-profilms-pd.json
+  211 entries (    7 unique) with and     0 without IMDB title ID in free-movies-publicdomainmovies-info.json
+ 1232 entries (   57 unique) with and  1875 without IMDB title ID in free-movies-publicdomainmovies-net.json
+   46 entries (   13 unique) with and    81 without IMDB title ID in free-movies-publicdomainreview.json
+  698 entries (   64 unique) with and   118 without IMDB title ID in free-movies-publicdomaintorrents.json
+ 1758 entries (  882 unique) with and  3786 without IMDB title ID in free-movies-retrofilmvault.json
+   16 entries (    0 unique) with and     0 without IMDB title ID in free-movies-thehillproductions.json
+   63 entries (   16 unique) with and   141 without IMDB title ID in free-movies-vodo.json
+11583 unique IMDB title IDs in total, 8724 only in one list, 24647 without IMDB title ID
 

-

It can also list the firmware packages providing firmware requested by the loaded kernel modules, which in my case is an empty list because I have all the firmware my machine needs:

I keep finding more data sources. I found the cinemovies source just a few days ago, and as you can see from the summary, it extended my list with 63 movies. Check out the mklist-* scripts in the git repository if you are curious how the lists are created. Many of the titles are extracted using searches on IMDB, where I look for the title and year, and accept search results with only one movie listed if the year matches. This allows me to automatically use many lists of movies without IMDB title ID references, at the cost of increasing the risk of wrongly identifying an IMDB title ID as public domain. So far my random manual checks have indicated that the method is solid, but I really wish all lists of public domain movies would include a unique movie identifier like the IMDB title ID. It would make the job of counting movies in the public domain a lot easier.

-

-% /usr/sbin/isenkram-autoinstall-firmware -l
-info: did not find any firmware files requested by loaded kernel modules.  exiting
-%
-

- -

The last few days I have had a look at several of the around 250 packages in Debian with udev rules. These seem like good candidates to install when a given hardware dongle is inserted, and I found several that should be proposed by isenkram. I have not had time to check all of them, but am happy to report that there are now 97 packages mapped to hardware by Isenkram. 11 of these packages provide hardware mapping using AppStream, while the rest are listed in the modaliases file provided in isenkram.

- -

These are the packages with hardware mappings at the moment. The -marked packages are also announcing their hardware -support using AppStream, for everyone to use:

- -

air-quality-sensor, alsa-firmware-loaders, argyll, -array-info, avarice, avrdude, b43-fwcutter, -bit-babbler, bluez, bluez-firmware, brltty, -broadcom-sta-dkms, calibre, cgminer, cheese, colord, -colorhug-client, dahdi-firmware-nonfree, dahdi-linux, -dfu-util, dolphin-emu, ekeyd, ethtool, firmware-ipw2x00, fprintd, -fprintd-demo, galileo, gkrellm-thinkbat, gphoto2, -gpsbabel, gpsbabel-gui, gpsman, gpstrans, gqrx-sdr, gr-fcdproplus, -gr-osmosdr, gtkpod, hackrf, hdapsd, hdmi2usb-udev, hpijs-ppds, hplip, -ipw3945-source, ipw3945d, kde-config-tablet, kinect-audio-setup, -libnxt, libpam-fprintd, lomoco, -madwimax, minidisc-utils, mkgmap, msi-keyboard, mtkbabel, -nbc, nqc, nut-hal-drivers, ola, -open-vm-toolbox, open-vm-tools, openambit, pcgminer, pcmciautils, -pcscd, pidgin-blinklight, printer-driver-splix, -pymissile, python-nxt, qlandkartegt, -qlandkartegt-garmin, rosegarden, rt2x00-source, sispmctl, -soapysdr-module-hackrf, solaar, squeak-plugins-scratch, sunxi-tools, -t2n, thinkfan, thinkfinger-tools, tlp, tp-smapi-dkms, -tp-smapi-source, tpb, tucnak, uhd-host, usbmuxd, viking, -virtualbox-ose-guest-x11, w1retap, xawtv, xserver-xorg-input-vmmouse, -xserver-xorg-input-wacom, xserver-xorg-video-qxl, -xserver-xorg-video-vmware, yubikey-personalization and -zd1211-firmware

- -

If you know of other packages, please let me know with a wishlist -bug report against the isenkram-cli package, and ask the package -maintainer to -add AppStream -metadata according to the guidelines to provide the information -for everyone. In time, I hope to get rid of the isenkram specific -hardware mapping and depend exclusively on AppStream.

- -

Note that the AppStream metadata for broadcom-sta-dkms is matching too much hardware, and suggests that the package works with any ethernet card. See bug #838735 for the details. I hope the maintainer finds time to address it soon. In the meantime I provide an override in isenkram.

+

As usual, if you use Bitcoin and want to show your support of my +activities, please send Bitcoin donations to my address +15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

@@ -830,6 +661,17 @@ the mean time I provide an override in isenkram.

Archive