«Rapporten ser ikke på informasjonssikkerhet knyttet til personlig integritet» (The report does not consider information security related to personal integrity)
27th June 2017

Today I came across the text «Killing car privacy by federal mandate» by Leonid Reyzin on Freedom to Tinker, and it pleases me to see a good explanation of why it is an unreasonable intrusion into the private sphere to have every car broadcast its position and movements by radio. The proposal in question, based on Dedicated Short Range Communication (DSRC), is called Basic Safety Message (BSM) in the USA and Cooperative Awareness Message (CAM) in Europe, and the Norwegian Public Roads Administration (Vegvesenet) appears to be among those considering requiring all cars to give up yet another piece of their owners' private sphere. I recommend everyone to read the text.

While looking into DSRC on cars in Norway, I came across a quote I find illustrative of how the Norwegian public sector handles issues concerning the citizens' private sphere, in the SINTEF report «Informasjonssikkerhet i AutoPASS-brikker» by Trond Foss:

    «Rapporten ser ikke på informasjonssikkerhet knyttet til personlig integritet.» (The report does not consider information security related to personal integrity.)

That is apparently how simple it can be when assessing information security. It is evidently enough that the people at the top can say "privacy has been taken care of", the popular but empty phrase that makes many believe individuals' integrity is being looked after. The quote made me wonder how often the same approach, simply ignoring the need for personal integrity, is chosen when yet another intrusion into the private sphere of people in Norway is arranged. It rarely causes reactions. The story of the reactions to Helse Sør-Øst's outsourcing is sadly the exception and the tip of the iceberg. I will keep declining AutoPASS and stay as far away from the Norwegian health services as I can, until they have demonstrated and documented that they value the individual's private sphere and personal integrity higher than short-term gain and public benefit.

+ +
30th August 2018
+

It might seem obvious that software created using tax money should be available for everyone to use and improve. Free Software Foundation Europe recently started a campaign to help get more people to understand this, and I just signed the petition on Public Money, Public Code to help them. I hope you too will do the same.


- -
12th June 2017
-

It is pleasing to see that the work we put down in publishing new editions of the classic Free Culture book by the founder of the Creative Commons movement, Lawrence Lessig, is still being appreciated. I had a look at the latest sales numbers for the paper edition today. Not too impressive, but happy to see some buyers still exist. All the revenue from the books is sent to the Creative Commons Corporation, and they receive the largest cut if you buy directly from Lulu. Most books are sold via Amazon, with Ingram second and only a small fraction directly from Lulu. The ebook edition is available for free from Github.

Title / language          2016 jan-jun   2016 jul-dec   2017 jan-may
Culture Libre / French               3              6             15
Fri kultur / Norwegian               7              1              0
Free Culture / English              14             27             16
Total                               24             34             31

A bit sad to see the low sales numbers for the Norwegian edition, and a bit surprising that the English edition is still selling so well.

- -

If you would like to translate and publish the book in your native language, I would be happy to help make it happen. Please get in touch.

+ +
13th August 2018
+

A few days ago, I wondered if there are any privacy respecting health monitors and/or fitness trackers available for sale these days. I would like to buy one, but do not want to share my personal data with strangers, nor be forced to have a mobile phone to get data out of the unit. I've received some ideas, and would like to share them with you.

One interesting data point was a pointer to a Free Software app for Android named Gadgetbridge. It provides cloudless collection and storing of data from a variety of trackers. Its list of supported devices is a good indicator for units where the protocol is fairly open, as it is obviously being handled by Free Software. Other units are reportedly encrypting the collected information with their own public key, making sure only the vendor cloud service is able to extract data from the unit. The people contacting me about Gadgetbridge said they were using the Amazfit Bip and Xiaomi Band 3.

+ +

I also got a suggestion to look at some of the units from Garmin. I was told their GPS watches can be connected via USB and show up as a USB storage device with Garmin FIT files containing the collected measurements. While proprietary, FIT files apparently can be read at least by GPSBabel and the GpxPod Nextcloud app. It is unclear to me if they can read step count and heart rate data. The person I talked to was using a Garmin Forerunner 935, which is a fairly expensive unit. I doubt it is worth it for a unit where the vendor clearly is trying its best to move from open to closed systems. I still remember when Garmin dropped NMEA support in its GPSes.
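A sketch of how such a conversion might look with GPSBabel, assuming the watch is mounted as USB storage (the input path and file name are made-up examples, not something I have verified on a real device):

# Convert a Garmin FIT activity file to GPX using GPSBabel.
gpsbabel -i garmin_fit -f /media/garmin/Garmin/Activities/2018-08-12.fit \
  -o gpx -F activity.gpx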

+ +

A final idea was to build one's own unit, perhaps by basing it on a wearable hardware platform like the Flora Geo Watch. Sounds like fun, but I had more money than time to spend on the topic, so I suspect it will have to wait for another time.

+ +

While I was working on tracking down links, I came across an inspiring TED talk by Dave Debronkart about being an e-patient, and discovered the web site Participatory Medicine. If you too want to track your own health and fitness without having information about your private life floating around on computers owned by others, I recommend checking it out.

+ +

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

- Tags: docbook, english, freeculture. + Tags: english.

- -
10th June 2017
-

I am very happy to report that the Nikita Noark 5 core project tagged its second release today. The free software solution is an implementation of the Norwegian archive standard Noark 5 used by government offices in Norway. These were the changes in version 0.1.1 since version 0.1.0 (from NEWS.md):

    - -
  • Continued work on the angularjs GUI, including document upload.
  • Implemented correspondencepartPerson, correspondencepartUnit and
    correspondencepartInternal.
  • Applied for Coverity coverage and started submitting code on a
    regular basis.
  • Started fixing bugs reported by Coverity.
  • Corrected and completed HATEOAS links to make sure the entire API
    is available via URLs in _links.
  • Corrected all relation URLs to use a trailing slash.
  • Added initial support for storing data in ElasticSearch.
  • Now able to receive and store uploaded files in the archive.
  • Changed JSON output for object lists to have relations in _links.
  • Improved JSON output for empty object lists.
  • Now uses the correct MIME type application/vnd.noark5-v4+json.
  • Added support for Docker container images.
  • Added a simple API browser implemented in JavaScript/Angular.
  • Started on an archive client implemented in JavaScript/Angular.
  • Started on a prototype to show the public mail journal.
  • Improved performance by disabling the Spring FileWatcher.
  • Added support for 'arkivskaper', 'saksmappe' and 'journalpost'.
  • Added support for some metadata code lists.
  • Added support for Cross-Origin Resource Sharing (CORS).
  • Changed login method from Basic Auth to JSON Web Token (RFC 7519)
    style.
  • Added support for GET-ing ny-* URLs.
  • Added support for modifying entities using PUT and eTag.
  • Added support for returning XML output on request.
  • Removed support for English field and class names, limiting
    ourselves to the official names.
  • ...
- -

If this sounds interesting to you, please contact us on IRC (#nikita on irc.freenode.net) or email (the nikita-noark mailing list).

+ +
7th August 2018
+

Dear lazyweb,

+ +

I wonder, is there a fitness tracker / health monitor available for sale today that respects the user's privacy? With this I mean a watch/bracelet capable of measuring pulse rate and other fitness/health related values (and by all means, also the correct time and location if possible), where I am the only one able to extract/read the data from the unit, using a computer, without a radio beacon and Internet connection. In other words, it should not depend on a cell phone app, and should not make the measurements available via other people's computers (aka "the cloud"). The collected data should be available using only free software. I'm not interested in depending on some non-free software that will leave me high and dry some time in the future. I've been unable to find any such unit. I would like to buy one. The ones I have seen for sale here in Norway are proud to report that they share my health data with strangers (aka "cloud enabled"). Is there an alternative? I'm not interested in giving money to people requiring me to accept "privacy terms" to allow myself to measure my own health.

+ +

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

@@ -198,99 +149,75 @@ mailing list).

- -
7th June 2017
-

This is a copy of an email I posted to the nikita-noark mailing list. Please follow up there if you would like to discuss this topic. The background is that we are making a free software archive system based on the Norwegian Noark 5 standard for government archives.

- -

I've been wondering a bit lately how trusted timestamps could be stored in Noark 5. Trusted timestamps can be used to verify that some information (document/file/checksum/metadata) has not been changed since a specific time in the past. This is useful to verify the integrity of the documents in the archive.

- -

Then it occurred to me, perhaps the trusted timestamps could be stored as document variants (ie dokumentobjekt referred to from dokumentbeskrivelse) with the filename set to the hash they are stamping?

- -

Given a "dokumentbeskrivelse" with an associated "dokumentobjekt", a new dokumentobjekt is associated with "dokumentbeskrivelse" with the same attributes as the stamped dokumentobjekt except these attributes:

- -
    - -
  • format -> "RFC3161"
  • mimeType -> "application/timestamp-reply"
  • formatDetaljer -> "<source URL for timestamp service>"
  • filenavn -> "<sjekksum>.tsr"
- -

This assumes a service following IETF RFC 3161 is used, which specifies the given MIME type for replies and the .tsr file ending for the content of such trusted timestamps. As far as I can tell from the Noark 5 specifications, it is OK to have several variants/renderings of a document attached to a given dokumentbeskrivelse objekt. It might be stretching it a bit to make some of these variants represent crypto-signatures useful for verifying the document integrity instead of representing the document itself.

- -

Using the source of the service in formatDetaljer allows several timestamping services to be used. This is useful to spread the risk of key compromise over several organisations. It would only be a problem to trust the timestamps if all of the organisations were compromised.

- -

The following oneliner on Linux can be used to generate the tsr file. $inputfile is the path to the file to checksum, and $sha256 is the SHA-256 checksum of the file (ie the "<sjekksum>" value mentioned above).

+ +
31st July 2018
+

For a while now, I have looked for a sensible way to share images with my family using a self-hosted solution, as it is unacceptable to place images from my personal life under the control of strangers working for data hoarders like Google or Dropbox. The last few days I have drafted an approach that might work out, and I would like to share it with you. I would like to publish images on a server under my control, and point some Internet connected display units using some free and open standard to the images I published. As my primary language is not limited to ASCII, I need to store metadata using UTF-8. Many years ago, I hoped to find a digital photo frame capable of reading an RSS feed with image references (aka using the <enclosure> RSS tag), but was unable to find a current supplier of such frames. In the end I gave up that approach.

+ +

Some months ago, I discovered that XScreensaver is able to read images from an RSS feed, and used it to set up a screen saver on my home info screen, showing images from the Daily images feed from NASA. This proved to work well. More recently I discovered that Kodi (both using OpenELEC and LibreELEC) provides the Feedreader screen saver capable of reading an RSS feed with images and news. For fun, I used it this summer to test Kodi on my parents' TV by hooking up a Raspberry Pi unit with LibreELEC, and wanted to provide them with a screen saver showing selected pictures from my collection.

+ +

Armed with motivation and a test photo frame, I set out to generate an RSS feed for the Kodi instance. I adjusted my Freedombox instance, created /var/www/html/privatepictures/, and wrote a small Perl script to extract title and description metadata from the photo files and generate the RSS file. I ended up using Perl instead of Python, as the libimage-exiftool-perl Debian package seemed to handle the EXIF/XMP tags I ended up using, while python3-exif did not. The relevant EXIF tags only support ASCII, so I had to find better alternatives. XMP seems to have the support I need.
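My Perl script is not shown here, but to inspect the metadata it has to work with, something like the following exiftool invocation can dump the relevant tags as JSON (the file name is just an example):

# Dump the XMP tags used for the RSS feed as JSON.
exiftool -j -headline -description -subject photo.jpeg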

+ +

I am a bit unsure which EXIF/XMP tags to use, as I would like to use tags that can be easily added/updated using normal free software photo managing software. I ended up using the tags set using this exiftool command, as these tags can also be set using digiKam:

-

openssl ts -query -data "$inputfile" -cert -sha256 -no_nonce \
  | curl -s -H "Content-Type: application/timestamp-query" \
      --data-binary "@-" http://zeitstempel.dfn.de > $sha256.tsr
-

- -

To verify the timestamp, you first need to download the public key of the trusted timestamp service, for example using this command:

- -

wget -O ca-cert.txt \
  https://pki.pca.dfn.de/global-services-ca/pub/cacert/chain.txt
-

- -

Note, the public key should be stored alongside the timestamps in the archive to make sure it is also available 100 years from now. It is probably a good idea to standardise how and where to store such public keys, to make them easier to find for those trying to verify documents 100 or 1000 years from now. :)

+
exiftool -headline='The RSS image title' \
  -description='The RSS image description.' \
  -subject+=for-family photo.jpeg
+
-

The verification itself is a simple openssl command:

+

I initially tried the "-title" and "keyword" tags, but they were invisible in digiKam, so I changed to "-headline" and "-subject". I use the keyword/subject 'for-family' to flag that the photo should be shared with my family. Images with this keyword set are located and copied into my Freedombox for the RSS generating script to find.
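A sketch of one way to locate and copy the flagged photos, assuming they live under ~/Pictures (the paths are examples, and my actual copying step may differ):

# Print the path of every photo carrying the 'for-family' keyword and
# copy it into the directory served by the Freedombox.
exiftool -if '$subject =~ /for-family/' -p '$directory/$filename' -r ~/Pictures |
while read -r photo; do
    cp "$photo" /var/www/html/privatepictures/
done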

-

openssl ts -verify -data $inputfile -in $sha256.tsr \
  -CAfile ca-cert.txt -text
-

+

Are there better ways to do this? Get in touch if you have better suggestions.

-

Is there any reason this approach would not work? Is it somehow against the Noark 5 specification?

+

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

@@ -298,61 +225,105 @@ the Noark 5 specification?

- -
3rd June 2017
-

Aftenposten reports today on errors in the exam questions for the exam in politics and human rights, where the texts in the Bokmål and Nynorsk versions differed. The exam text is quoted in the article, and I got curious whether the free software translation solution Apertium would have done a better job than the Norwegian Directorate for Education and Training. It looks that way.

- -

Here is the Bokmål text from the exam:

- -
-

Drøft utfordringene knyttet til nasjonalstatenes og andre aktørers rolle og muligheter til å håndtere internasjonale utfordringer, som for eksempel flykningekrisen.

- -

Vedlegge er eksempler på tekster som kan gi relevante perspektiver på temaet:

-
    -
  1. Flykningeregnskapet 2016, UNHCR og IDMC
  2. «Grenseløst Europa for fall» A-Magasinet, 26. november 2015
- -
- -

Apertium translates this as follows:

- -
-

Drøft utfordringane knytte til nasjonalstatane sine og rolla til andre aktørar og høve til å handtera internasjonale utfordringar, som til dømes *flykningekrisen.

- -

Vedleggja er døme på tekster som kan gje relevante perspektiv på temaet:

- -
    -
  1. *Flykningeregnskapet 2016, *UNHCR og *IDMC
  2. «*Grenseløst Europa for fall» A-Magasinet, 26. november 2015
- -
- -

Words that were not understood are marked with an asterisk (*) and need an extra language check. But no words have disappeared, as happened in the exam text the pupils were presented with. I do suspect that "andre aktørers rolle og muligheter til ..." should have been translated to "rolla til andre aktørar og deira høve til ..." or something along those lines, but that is perhaps nitpicking. It merely underlines that automatic translation always needs proofreading afterwards.
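For those who want to test this themselves, a sketch of how such a translation can be run, assuming the Apertium Bokmål to Nynorsk language pair is installed and that the mode is named nob-nno (an assumption; adjust to the pairs available on your system):

# Translate Norwegian Bokmål to Nynorsk with Apertium (mode name assumed).
echo 'Drøft utfordringene knyttet til nasjonalstatenes rolle.' | apertium nob-nno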

+ +
12th July 2018
+

Last night, I wrote a recipe to stream a Linux desktop using VLC to an instance of Kodi. During the day I received valuable feedback, and thanks to the suggestions I have been able to rewrite the recipe into a much simpler approach requiring no setup at all. It is a single script that takes care of it all.

+ +

This new script uses GStreamer instead of VLC to capture the desktop and stream it to Kodi. This fixed the video quality issue I saw initially. It further removes the need to add an m3u file on the Kodi machine, as it instead connects to the JSON-RPC API in Kodi and simply asks Kodi to play from the stream created using GStreamer. Streaming the desktop to Kodi now becomes trivial. Copy the script below, run it with the DNS name or IP address of the Kodi server to stream to as the only argument, and watch your screen show up on the Kodi screen. Note, it depends on multicast on the local network, so if you need to stream outside the local network, the script must be modified. Also note, I have no idea if audio works, as I only care about the picture part.

+ +
#!/bin/sh
#
# Stream the Linux desktop view to Kodi.  See
# http://people.skolelinux.org/pere/blog/Streaming_the_Linux_desktop_to_Kodi_using_VLC_and_RTSP.html
# for background information.

# Make sure the stream is stopped in Kodi and the gstreamer process is
# killed if something goes wrong (for example if curl is unable to find the
# kodi server).  Do the same when interrupting this script.
kodicmd() {
    host="$1"
    cmd="$2"
    params="$3"
    curl --silent --header 'Content-Type: application/json' \
	 --data-binary "{ \"id\": 1, \"jsonrpc\": \"2.0\", \"method\": \"$cmd\", \"params\": $params }" \
	 "http://$host/jsonrpc"
}
cleanup() {
    if [ -n "$kodihost" ] ; then
	# Stop the playing when we end
	playerid=$(kodicmd "$kodihost" Player.GetActivePlayers "{}" |
			    jq .result[].playerid)
	kodicmd "$kodihost" Player.Stop "{ \"playerid\" : $playerid }" > /dev/null
    fi
    if [ "$gstpid" ] && kill -0 "$gstpid" >/dev/null 2>&1; then
	kill "$gstpid"
    fi
}
trap cleanup EXIT INT

if [ -n "$1" ]; then
    kodihost=$1
    shift
else
    kodihost=kodi.local
fi

mcast=239.255.0.1
mcastport=1234
mcastttl=1

pasrc=$(pactl list | grep -A2 'Source #' | grep 'Name: .*\.monitor$' | \
  cut -d" " -f2|head -1)
gst-launch-1.0 ximagesrc use-damage=0 ! video/x-raw,framerate=30/1 ! \
  videoconvert ! queue2 ! \
  x264enc bitrate=8000 speed-preset=superfast tune=zerolatency qp-min=30 \
  key-int-max=15 bframes=2 ! video/x-h264,profile=high ! queue2 ! \
  mpegtsmux alignment=7 name=mux ! rndbuffersize max=1316 min=1316 ! \
  udpsink host=$mcast port=$mcastport ttl-mc=$mcastttl auto-multicast=1 sync=0 \
  pulsesrc device=$pasrc ! audioconvert ! queue2 ! avenc_aac ! queue2 ! mux. \
  > /dev/null 2>&1 &
gstpid=$!

# Give stream a second to get going
sleep 1

# Ask kodi to start streaming using its JSON-RPC API
kodicmd "$kodihost" Player.Open \
	"{\"item\": { \"file\": \"udp://@$mcast:$mcastport\" } }" > /dev/null

# wait for gst to end
wait "$gstpid"
+ +

I hope you find the approach useful. I know I do.

+ +

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

- Tags: debian, norsk, stavekontroll. + Tags: debian, english, video.

- -
27th April 2017
-

These days, with a deadline of May 1st, the National Archivist of Norway (Riksarkivaren) has a regulation out for public consultation. As one can see, there is not much time left before the deadline, which runs out on Sunday. This regulation is the one listing which formats are acceptable for archiving in Noark 5 solutions in Norway.

- -

I found the consultation documents at Norsk Arkivråd after being tipped off on the mailing list of the free software project Nikita Noark5-Core, which is building a Noark 5 Tjenestegrensesnitt (service interface). I am involved in the Nikita project, and thanks to my interest in the service interface project I have read quite a few Noark 5 related documents, and to my surprise discovered that standard email is not on the list of approved formats that can be archived. The consultation with its deadline on Sunday is an excellent opportunity to try to do something about it. I am working on my own consultation response, and wonder if others are interested in supporting the proposal to allow archiving email as email in the archive.

- -

Are you already writing your own consultation response? If so, you could consider including a sentence about email storage. I do not think much is needed. Here is a short proposed text:

- -

- -

We refer to the consultation sent out 2017-02-17 (the National Archivist's reference 2016/9840 HELHJO), and take the liberty of submitting some input on the revision of the Regulation on supplementary technical and archival provisions on the processing of public archives (the National Archivist's regulation).

- -

A very large part of our communication today takes place by email. We therefore propose that Internet email, as described in IETF RFC 5322, https://tools.ietf.org/html/rfc5322, should be included as an approved document format. We propose that the regulation's list of approved document formats on submission in § 5-16 is changed to include Internet email.

- -

- -

As part of the work on the service interface, we have tested how email can be stored in a Noark 5 structure, and are writing a proposal for how this can be done, which will be sent to the National Archives (Arkivverket) as soon as it is finished. Those interested can follow the progress on the web.

- -

Update 2017-04-28: Today the consultation response I wrote was submitted by the NUUG association.

+ +
12th July 2018
+

PS: See the followup post for an even better approach.

+ +

A while back, I was asked by a friend how to stream the desktop to my projector connected to Kodi. I sadly had to admit that I had no idea, as it was a task I had never tried. Since then, I have been looking for a way to do so, preferably without much extra software to install on either side. Today I found a way that seems to kind of work. Not great, but it is a start.

+ +

I had a look at several approaches, for example using uPnP DLNA as described in 2011, but it required a uPnP server, fuse and enough local storage to store the stream locally. This is not going to work well for me, lacking enough free space, and it would be impossible for my friend to get working.

+ +

Next, it occurred to me that perhaps I could use VLC to create a video stream that Kodi could play. Preferably using broadcast/multicast, to avoid having to change any setup on the Kodi side when starting such a stream. Unfortunately, the only recipe I could find using multicast used the rtp protocol, and this protocol seems not to be supported by Kodi.

+ +

On the other hand, the rtsp protocol is working! Unfortunately I have to specify the IP address of the streaming machine in both the sending command and the file on the Kodi server. But it is showing my desktop, and thus allows us to have a shared look on the big screen at the programs I work on.

+ +

I did not spend much time investigating codecs. I combined the rtp and rtsp recipes from the VLC Streaming HowTo/Command Line Examples, and was able to get this working on the desktop/streaming end.

+ +
vlc screen:// --sout \
  '#transcode{vcodec=mp4v,acodec=mpga,vb=800,ab=128}:rtp{dst=projector.local,port=1234,sdp=rtsp://192.168.11.4:8080/test.sdp}'
+
+ +

I ssh-ed into my Kodi box and created a file like this with the same IP address:

+ +
echo rtsp://192.168.11.4:8080/test.sdp \
  > /storage/videos/screenstream.m3u
+
+ +

Note the 192.168.11.4 IP address is my desktop's IP address. As far as I can tell the IP must be hardcoded for this to work. In other words, if someone else's machine is going to do the streaming, you have to update screenstream.m3u on the Kodi machine and adjust the vlc recipe. To get started, locate the file in Kodi and select the m3u file while the VLC stream is running. The desktop then shows up on my big screen. :)

+ +

When using the same technique to stream a video file with audio, the audio quality is really bad. No idea if the problem is packet loss or bad parameters for the transcode. I do not know VLC nor Kodi well enough to tell.

+ +

Update 2018-07-12: Johannes Schauer sent me a few suggestions and reminded me about an important step. The "screen:" input source is only available once the vlc-plugin-access-extra package is installed on Debian. Without it, you will see this error message: "VLC is unable to open the MRL 'screen://'. Check the log for details." He further found that it is possible to drop some parts of the VLC command line to reduce the amount of hardcoded information. It is also useful to consider using cvlc to avoid having the VLC window in the desktop view. In sum, this gives us this command line on the source end:

cvlc screen:// --sout \
  '#transcode{vcodec=mp4v,acodec=mpga,vb=800,ab=128}:rtp{sdp=rtsp://:8080/}'
+
+ +

and this on the Kodi end

+ +

echo rtsp://192.168.11.4:8080/ \
  > /storage/videos/screenstream.m3u
+
+ +

Still bad image quality, though. But I did discover that streaming a DVD using dvdsimple:///dev/dvd as the source had excellent video and audio quality, so I guess the issue is in the input or transcoding parts, not the rtsp part. I've tried to change the vb and ab parameters to use more bandwidth, but it did not make a difference.
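For reference, such a DVD test would combine the dvdsimple source with the rtsp recipe above, along these lines (a sketch, not the exact command used):

cvlc dvdsimple:///dev/dvd --sout \
  '#transcode{vcodec=mp4v,acodec=mpga,vb=800,ab=128}:rtp{sdp=rtsp://:8080/}'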

+ +

I further received a suggestion from Einar Haraldseid to try using GStreamer instead of VLC, and this proved to work great! He also provided me with the trick to get Kodi to use a multicast stream as its source. By using this monstrous oneliner, I can stream my desktop with good video quality at a reasonable framerate to the 239.255.0.1 multicast address on port 1234:

gst-launch-1.0 ximagesrc use-damage=0 ! video/x-raw,framerate=30/1 ! \
  videoconvert ! queue2 ! \
  x264enc bitrate=8000 speed-preset=superfast tune=zerolatency qp-min=30 \
  key-int-max=15 bframes=2 ! video/x-h264,profile=high ! queue2 ! \
  mpegtsmux alignment=7 name=mux ! rndbuffersize max=1316 min=1316 ! \
  udpsink host=239.255.0.1 port=1234 ttl-mc=1 auto-multicast=1 sync=0 \
  pulsesrc device=$(pactl list | grep -A2 'Source #' | \
    grep 'Name: .*\.monitor$' |  cut -d" " -f2|head -1) ! \
  audioconvert ! queue2 ! avenc_aac ! queue2 ! mux.
+
+ +

and this on the Kodi end

+ +

echo udp://@239.255.0.1:1234 \
  > /storage/videos/screenstream.m3u
+
+ +

Note the trick to pick a valid pulseaudio source. It might not pick the one you need. This approach will of course lead to trouble if more than one source uses the same multicast port and address. Note the ttl-mc=1 setting, which limits the multicast packets to the local network. If the value is increased, your screen will be broadcast further, one network "hop" for each increase (read up on multicast to learn more :).

+ +

Having cracked how to get Kodi to receive multicast streams, I could use this VLC command to stream to the same multicast address. The image quality is way better than the rtsp approach, but GStreamer seems to be doing a better job.

+ +
cvlc screen:// --sout '#transcode{vcodec=mp4v,acodec=mpga,vb=800,ab=128}:rtp{mux=ts,dst=239.255.0.1,port=1234,sdp=sap}'
+
+ +

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

@@ -428,52 +484,121 @@ fremdriften på web.

- -
20th April 2017
-

Today I discovered that the web site publishing public mail journals from Norwegian government agencies, OEP, has started blocking certain types of web clients from access. I do not know how many are affected, but it at least applies to libwww-perl and curl. To test it yourself, run the following:

+ +
9th July 2018
+

Five years ago, I measured what the most supported MIME type in Debian was, by analysing the desktop files in all packages in the archive. Since then, the DEP-11 AppStream system has been put into production, making the task a lot easier. This made me want to repeat the measurement, to see how much things changed. Here are the new numbers, for unstable only this time:

Debian Unstable:

+ +
  count MIME type
  ----- -----------------------
     56 image/jpeg
     55 image/png
     49 image/tiff
     48 image/gif
     39 image/bmp
     38 text/plain
     37 audio/mpeg
     34 application/ogg
     33 audio/x-flac
     32 audio/x-mp3
     30 audio/x-wav
     30 audio/x-vorbis+ogg
     29 image/x-portable-pixmap
     27 inode/directory
     27 image/x-portable-bitmap
     27 audio/x-mpeg
     26 application/x-ogg
     25 audio/x-mpegurl
     25 audio/ogg
     24 text/html
+
+ +

The list was created like this using a sid chroot: "cat /var/lib/apt/lists/*sid*_dep11_Components-amd64.yml.gz | zcat | awk '/^  - \S+\/\S+$/ {print $2 }' | sort | uniq -c | sort -nr | head -20"

+ +

It is interesting to see how image formats have passed text/plain as the most announced supported MIME type. These days, thanks to the AppStream system, if you run into a file format you do not know, and want to figure out which packages support the format, you can find the MIME type of the file using "file --mime <filename>", and then look up all packages announcing support for this format in their AppStream metadata (XML or .desktop file) using "appstreamcli what-provides mimetype <mime-type>". For example if you, like me, want to know which packages support inode/directory, you can get a list like this:

-
-% curl -v -s https://www.oep.no/pub/report.xhtml?reportId=3 2>&1 |grep '< HTTP'
-< HTTP/1.1 404 Not Found
-% curl -v -s --header 'User-Agent:Opera/12.0' https://www.oep.no/pub/report.xhtml?reportId=3 2>&1 |grep '< HTTP'
-< HTTP/1.1 200 OK
+

% appstreamcli what-provides mimetype inode/directory | grep Package: | sort
Package: anjuta
Package: audacious
Package: baobab
Package: cervisia
Package: chirp
Package: dolphin
Package: doublecmd-common
Package: easytag
Package: enlightenment
Package: ephoto
Package: filelight
Package: gwenview
Package: k4dirstat
Package: kaffeine
Package: kdesvn
Package: kid3
Package: kid3-qt
Package: nautilus
Package: nemo
Package: pcmanfm
Package: pcmanfm-qt
Package: qweborf
Package: ranger
Package: sirikali
Package: spacefm
Package: spacefm
Package: vifm
%
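The two commands can be combined to go from an unknown file straight to the packages announcing support for its format; a sketch (the file name is a made-up example):

% appstreamcli what-provides mimetype "$(file --brief --mime-type mysteryfile)"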
-
+

+ +

Using the same method, I can quickly discover that the Sketchup file format is not yet supported by any package in Debian:

+ +

% appstreamcli what-provides mimetype  application/vnd.sketchup.skp
Could not find component providing 'mimetype::application/vnd.sketchup.skp'.
%
+

+ +

Yesterday I used it to figure out which packages support the STL 3D format:

+ +

% appstreamcli what-provides mimetype  application/sla|grep Package
Package: cura
Package: meshlab
Package: printrun
%
+

-

Here you can see that the service returns '404 Not Found' for curl with its default settings, while it returns '200 OK' if curl claims to be Opera version 12.0. The public electronic mail journal (OEP) started blocking on 2017-03-02.

- -

The blocking makes it a bit harder to fetch information from oep.no automatically. Could the blocking have been done to prevent automated collection of information from OEP, the way the openness committee of the Norwegian press did to document how the ministries hinder access to information in the report «Slik hindrer departementer innsyn», published in January 2017? It seems unlikely, as it is trivial to change the User-Agent to something new.

- -

Is there any legal basis for the public sector to discriminate between web clients the way it is done here, where access is granted or denied depending on what the client claims its own name is? As OEP is owned by DIFI and operated by Basefarm, perhaps there are documents exchanged between these two parties that one could request access to in order to understand what has happened. But the public mail journal of DIFI shows only two documents between DIFI and Basefarm during the last year. Mimes brønn is next, I think.

+

PS: A new version of Cura was uploaded to Debian yesterday.

+ +

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

@@ -481,101 +606,83 @@ tenker jeg.

- -
19th March 2017
-

The Nikita Noark 5 core project is implementing the Norwegian standard for keeping an electronic archive of government documents. The Noark 5 standard documents the requirements for data systems used by the archives in the Norwegian government, and the Noark 5 web interface specification documents a REST web service for storing, searching and retrieving documents and metadata in such an archive. I've been involved in the project since a few weeks before Christmas, when the Norwegian Unix User Group announced it supported the project. I believe this is an important project, and hope it can make it possible for the government archives in the future to use free software to keep the archives we citizens depend on. But as I do not hold such an archive myself, personally my first use case is to store and analyse public mail journal metadata published by the government. I find it useful to have a clear use case in mind when developing, to make sure the system scratches one of my itches.

- -

If you would like to help make sure there is a free software alternative for the archives, please join our IRC channel (#nikita on irc.freenode.net) and the project mailing list.

- -

When I got involved, the web service could store metadata about documents. But a few weeks ago, a new milestone was reached when it became possible to store full text documents too. Yesterday, I completed an implementation of a command line tool archive-pdf to upload a PDF file to the archive using this API. The tool is very simple at the moment, and finds existing fonds, series and files while asking the user to select which one to use if more than one exists. Once a file is identified, the PDF is associated with the file and uploaded, using the title extracted from the PDF itself. The process is fairly similar to visiting the archive, opening a cabinet, locating a file and storing a piece of paper in the archive. Here is a test run directly after populating the database with test data using our API tester:

+ +
8th July 2018
+

Quite regularly, I let my Debian Sid/Unstable chroot stay untouched for a while, and when I need to update it there is not enough free space on the disk for apt to do a normal 'apt upgrade'. I normally resolve the issue by doing 'apt install <somepackages>' to upgrade only some of the packages in one batch, until the amount of packages to download falls below the amount of free space available. Today, I had about 500 packages to upgrade, and after a while I got tired of trying to install chunks of packages manually. I concluded that I did not have the spare hours required to complete the task, and decided to see if I could automate it. I came up with this small script, which I call 'apt-in-chunks':

~/src//noark5-tester$ ./archive-pdf mangelmelding/mangler.pdf
using arkiv: Title of the test fonds created 2017-03-18T23:49:32.103446
using arkivdel: Title of the test series created 2017-03-18T23:49:32.103446

 0 - Title of the test case file created 2017-03-18T23:49:32.103446
 1 - Title of the test file created 2017-03-18T23:49:32.103446
Select which mappe you want (or search term): 0
Uploading mangelmelding/mangler.pdf
  PDF title: Mangler i spesifikasjonsdokumentet for NOARK 5 Tjenestegrensesnitt
  File 2017/1: Title of the test case file created 2017-03-18T23:49:32.103446
~/src//noark5-tester$
#!/bin/sh
#
# Upgrade packages when the disk is too full to upgrade every
# upgradable package in one lump.  Fetching packages to upgrade using
# apt, and then installing using dpkg, to avoid changing the package
# flag for manual/automatic.

set -e

ignore() {
    if [ "$1" ]; then
	grep -v "$1"
    else
	cat
    fi
}

for p in $(apt list --upgradable | ignore "$@" |cut -d/ -f1 | grep -v '^Listing...'); do
    echo "Upgrading $p"
    apt clean
    apt install --download-only -y $p
    for f in /var/cache/apt/archives/*.deb; do
	if [ -e "$f" ]; then
	    dpkg -i /var/cache/apt/archives/*.deb
	    break
	fi
    done
done
 

-

You can see here how the fonds (arkiv) and series (arkivdel) only had one option, while the user needs to choose which file (mappe) to use among the two created by the API tester. The archive-pdf tool can be found in the git repository for the API tester.

- -

In the project, I have been mostly working on the API tester so far, while getting to know the code base. The API tester currently uses the HATEOAS links to traverse the entire exposed service API and verify that the exposed operations and objects match the specification, as well as trying to create objects holding metadata and uploading a simple XML file to store. The tester has proved very useful for finding flaws in our implementation, as well as flaws in the reference site and the specification.

- -

The test document I uploaded is a summary of all the specification defects we have collected so far while implementing the web service. There are several unclear and conflicting parts of the specification, and we have started writing down the questions we get from implementing it. We use a format inspired by how The Austin Group collects defect reports for the POSIX standard with their instructions for the MANTIS defect tracker system, in lack of an official way to structure defect reports for Noark 5 (our first submitted defect report was a request for a procedure for submitting defect reports :).

The Nikita project is implemented using Java and Spring, and is -fairly easy to get up and running using Docker containers for those -that want to test the current code base. The API tester is -implemented in Python.

+

The script will extract the list of packages to upgrade, try to download the packages needed to upgrade one package, and install the downloaded packages using dpkg. The idea is to upgrade packages without changing the APT mark for the package (ie the one recording whether the package was manually requested or pulled in as a dependency). To use it, simply run it as root from the command line. If it fails, try 'apt install -f' to clean up the mess and run the script again. This might happen if the new packages conflict with one of the old packages that dpkg is unable to remove, while apt can.

+ +

It takes one option, a package to ignore in the list of packages to upgrade. The option is there to be able to skip the packages that are simply too large to unpack. Today this was 'ghc', but I have run into other large packages causing similar problems earlier (like TeX).
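For example, to upgrade everything except ghc, something like this should work, assuming the script above is saved as apt-in-chunks somewhere in PATH and made executable (the argument is used as a grep -v pattern):

# Upgrade all upgradable packages except those matching 'ghc'.
apt-in-chunks ghc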

+ +

Update 2018-07-08: Thanks to Paul Wise, I am aware of two alternative ways to handle this. The "unattended-upgrades --minimal-upgrade-steps" option will try to calculate upgrade sets for each package to upgrade, and then upgrade them in order, smallest set first. It might be a better option than my script above. Also, "aptitude upgrade" can upgrade single packages, thus avoiding the need for using "dpkg -i" in the script above.

+ +

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

@@ -583,114 +690,32 @@ implemented in Python.

- -
9th March 2017
-

Over the years, administrating thousands of NFS-mounting Linux computers at a time, I often needed a way to detect if a machine was experiencing NFS hang. If you try to use df or look at a file or directory affected by the hang, the process (and possibly the shell) will hang too. So you want to be able to detect this without risking the detection process getting stuck too. It has not been obvious how to do this. When the hang has lasted a while, it is possible to find messages like these in dmesg:

- -

nfs: server nfsserver not responding, still trying
nfs: server nfsserver OK

- -

It is hard to know if the hang is still going on, and it is hard to be sure looking in dmesg is going to work. If there are lots of other messages in dmesg, the lines might have rotated out of sight before they are noticed.

- -

While reading through the NFS client implementation in the Linux kernel code, I came across some statistics that seem to give a way to detect it. The om_timeouts sunrpc value in the kernel will increase every time the above log entry is inserted into dmesg. And after digging a bit further, I discovered that this value shows up in /proc/self/mountstats on Linux.

- -

The mountstats content seems to be shared between files using the same file system context, so it is enough to check one of the mountstats files to get the state of the mount points for the machine. I assume this will not show lazily umounted NFS points, nor NFS mount points in a different process context (ie with a different filesystem view), but that does not worry me.

- -

The content for an NFS mount point looks similar to this:

- -

-[...]
-device /dev/mapper/Debian-var mounted on /var with fstype ext3
-device nfsserver:/mnt/nfsserver/home0 mounted on /mnt/nfsserver/home0 with fstype nfs statvers=1.1
-        opts:   rw,vers=3,rsize=65536,wsize=65536,namlen=255,acregmin=3,acregmax=60,acdirmin=30,acdirmax=60,soft,nolock,proto=tcp,timeo=600,retrans=2,sec=sys,mountaddr=129.240.3.145,mountvers=3,mountport=4048,mountproto=udp,local_lock=all
-        age:    7863311
-        caps:   caps=0x3fe7,wtmult=4096,dtsize=8192,bsize=0,namlen=255
-        sec:    flavor=1,pseudoflavor=1
-        events: 61063112 732346265 1028140 35486205 16220064 8162542 761447191 71714012 37189 3891185 45561809 110486139 4850138 420353 15449177 296502 52736725 13523379 0 52182 9016896 1231 0 0 0 0 0 
-        bytes:  166253035039 219519120027 0 0 40783504807 185466229638 11677877 45561809 
-        RPC iostats version: 1.0  p/v: 100003/3 (nfs)
-        xprt:   tcp 925 1 6810 0 0 111505412 111480497 109 2672418560317 0 248 53869103 22481820
-        per-op statistics
-                NULL: 0 0 0 0 0 0 0 0
-             GETATTR: 61063106 61063108 0 9621383060 6839064400 453650 77291321 78926132
-             SETATTR: 463469 463470 0 92005440 66739536 63787 603235 687943
-              LOOKUP: 17021657 17021657 0 3354097764 4013442928 57216 35125459 35566511
-              ACCESS: 14281703 14290009 5 2318400592 1713803640 1709282 4865144 7130140
-            READLINK: 125 125 0 20472 18620 0 1112 1118
-                READ: 4214236 4214237 0 715608524 41328653212 89884 22622768 22806693
-               WRITE: 8479010 8494376 22 187695798568 1356087148 178264904 51506907 231671771
-              CREATE: 171708 171708 0 38084748 46702272 873 1041833 1050398
-               MKDIR: 3680 3680 0 773980 993920 26 23990 24245
-             SYMLINK: 903 903 0 233428 245488 6 5865 5917
-               MKNOD: 80 80 0 20148 21760 0 299 304
-              REMOVE: 429921 429921 0 79796004 61908192 3313 2710416 2741636
-               RMDIR: 3367 3367 0 645112 484848 22 5782 6002
-              RENAME: 466201 466201 0 130026184 121212260 7075 5935207 5961288
-                LINK: 289155 289155 0 72775556 67083960 2199 2565060 2585579
-             READDIR: 2933237 2933237 0 516506204 13973833412 10385 3190199 3297917
-         READDIRPLUS: 1652839 1652839 0 298640972 6895997744 84735 14307895 14448937
-              FSSTAT: 6144 6144 0 1010516 1032192 51 9654 10022
-              FSINFO: 2 2 0 232 328 0 1 1
-            PATHCONF: 1 1 0 116 140 0 0 0
-              COMMIT: 0 0 0 0 0 0 0 0
-
-device binfmt_misc mounted on /proc/sys/fs/binfmt_misc with fstype binfmt_misc
-[...]
-

- -

The key number to look at is the third number in the per-op list. -It is the number of NFS timeouts experiences per file system -operation. Here 22 write timeouts and 5 access timeouts. If these -numbers are increasing, I believe the machine is experiencing NFS -hang. Unfortunately the timeout value do not start to increase right -away. The NFS operations need to time out first, and this can take a -while. The exact timeout value depend on the setup. For example the -defaults for TCP and UDP mount points are quite different, and the -timeout value is affected by the soft, hard, timeo and retrans NFS -mount options.

- -

The only way I have found to get the timeout count on Debian and RedHat Enterprise Linux is to peek in /proc/. But according to the Solaris 10 System Administration Guide: Network Services, the 'nfsstat -c' command can be used to get these timeout values. But this does not work on Linux, as far as I can tell. I asked Debian about this, but have not seen any replies yet.

- -

Is there a better way to figure out if a Linux NFS client is experiencing NFS hangs? Is there a way to detect which processes are affected? Is there a way to get the NFS mount going again quickly once the network problem causing the NFS hang has been cleared? I would very much welcome some clues, as we regularly run into NFS hangs.
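Lacking a better way, here is a minimal sketch of the /proc/self/mountstats approach described above (my own assumption of a workable check, not a polished tool): it sums the third value of every per-op line, and if the total increases between two runs, the client is most likely experiencing NFS timeouts.

#!/bin/sh
# Sum the per-op timeout counters (third number) for all NFS mounts
# listed in /proc/self/mountstats.
awk '
  /per-op statistics/ { inop = 1; next }
  /^device /          { inop = 0 }
  inop && NF >= 4     { sum += $4 }
  END { print "total NFS timeouts:", sum }
' /proc/self/mountstats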

+ +
30th June 2018
+

So far, at least hydro-electric power, coal power, wind power, solar power, and wood power are well known. Until a few days ago, I had never heard of stone power. Then I learned about a quarry in a mountain in Bremanger in Norway, where the Bremanger Quarry company is extracting stone and dumping it into a shaft leading to its shipping harbour. The downward movement in this shaft is used to produce electricity. In short, it is using falling rocks instead of falling water to produce electricity, and according to its own statements it is producing more power than it is using, and selling the surplus electricity to the Norwegian power grid. I find the concept truly amazing. Is this the world's only stone power plant?

+ +

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

- Tags: debian, english, sysadmin. + Tags: english.
@@ -698,44 +723,66 @@ much welcome some clues, as we regularly run into NFS hangs.

- -
8th March 2017
-

So the new president in the United States of America claims to be surprised to discover that he was wiretapped during the election before he was elected president. He even claims this must be illegal. Well, doh, if there is one thing the confirmations from Snowden documented, it is that the entire population of the USA is wiretapped, one way or another. Of course the presidential candidates were wiretapped, alongside the senators, judges and the rest of the people in the USA.

- -

Next, the Federal Bureau of Investigation asked the Department of Justice to go public rejecting the claims that Donald Trump was wiretapped illegally. I fail to see the relevance, given that I am sure the surveillance industry in the USA believes it has all the legal backing it needs to conduct mass surveillance on the entire world.

- -

There is even the director of the FBI stating that he never saw an order requesting wiretapping of Donald Trump. That is not very surprising, given how the FISA court works, with all its activity being secret. Perhaps he only heard about it?

- -

What I find most sad in this story is how Norwegian journalists present it. In a news report the other day on the radio from the Norwegian Broadcasting Corporation (NRK), I heard the journalist claim that 'the FBI denies any wiretapping', while the reality is that 'the FBI denies any illegal wiretapping'. There is a fundamental and important difference, and it makes me sad that the journalists are unable to grasp it.

- -

Update 2017-03-13: Looks like The Intercept reports that US Senator Rand Paul confirms what I stated above.

+ +
26th June 2018
+

My movie playing setup involves Kodi, OpenELEC (probably soon to be replaced with LibreELEC) and an Infocus IN76 video projector. My projector can be controlled via both an infrared remote control and an RS-232 serial line. The vendor of my projector, InFocus, was sensible enough to document the serial protocol in its user manual, so it is easily available, and I used it some years ago to write a small script to control the projector. For a while now, I have longed for a setup where the projector was controlled by Kodi, for example in such a way that when the screen saver went on, the projector was turned off, and when the screen saver exited, the projector was turned on again.

+ +

A few days ago, with very good help from parts of my family, I managed to find a Kodi add-on for controlling an Epson projector, and got in touch with its author to see if we could join forces and make an add-on with support for several projectors. To my pleasure, he was positive to the idea, and we set out to add InFocus support to his add-on, and make the add-on suitable for the official Kodi add-on repository.

+ +

The add-on is now working (for me, at least), with a few minor adjustments. The most important change I made relative to the master branch in the github repository is embedding the pyserial module in the add-on. The long term solution is to make a "script" type pyserial module for Kodi, that can be pulled in as a dependency in Kodi. But until that is in place, I embed it.

+ +

The add-on can be configured to turn on the projector when Kodi starts, turn it off when Kodi stops, as well as turn the projector off when the screensaver starts and on when the screensaver stops. It can also be told to set the projector source when turning on the projector.

If this sound interesting to you, check out +the +project github repository. Perhaps you can send patches to +support your projector too? As soon as we find time to wrap up the +latest changes, it should be available for easy installation using any +Kodi instance.

+ +

For future improvements, I would like to add projector model detection and the ability to adjust the brightness level of the projector from within Kodi. We also need to figure out how to handle the cooling period of the projector. My projector refuses to turn on for 60 seconds after it was turned off. This is not handled well by the add-on at the moment.

+ +

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

@@ -750,6 +797,25 @@ Intercept report that US Senator Rand Paul confirm what I state above.

Archive