A few days ago, I wondered if there are any privacy-respecting
-health monitors and/or fitness trackers available for sale these days.
-I would like to buy one, but do not want to share my personal data
-with strangers, nor be forced to have a mobile phone to get data out
-of the unit.  I've received some ideas, and would like to share them
-with you.
-
-One interesting data point was a pointer to a Free Software app for
-Android called
-Gadgetbridge.
-It provides cloudless collection and storage of data from a variety of
-trackers.  Its
-list
-of supported devices is a good indicator for units where the
-protocol is fairly open, as it is obviously being handled by Free
-Software.  Other units are reportedly encrypting the collected
-information with their own public key, making sure only the vendor
-cloud service is able to extract data from the unit.  The people
-contacting me about it said they were using
-Amazfit
-Bip and
-Xiaomi
-Band 3.
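For those wondering what cloudless collection looks like in practice, Gadgetbridge keeps the measurements in a local database on the phone, which can be exported from the app and inspected with free software. A minimal sketch, assuming the exported file is called Gadgetbridge.db; the exact schema depends on the Gadgetbridge version and tracker model:

# Copy the exported database to a computer, then look around in it
sqlite3 Gadgetbridge.db '.tables'
sqlite3 Gadgetbridge.db '.schema' | less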
-
-I also got a suggestion to look at some of the units from Garmin.
-I was told their GPS watches can be connected via USB and show up as a
-USB storage device with
-Garmin
-FIT files containing the collected measurements.  While
-proprietary, FIT files apparently can be read at least by
-GPSBabel and the
-GpxPod Nextcloud
-app.  It is unclear to me if they can read step count and heart rate
-data.  The person I talked to was using a
-Garmin Forerunner
-935, which is a fairly expensive unit.  I doubt it is worth it for
-a unit where the vendor clearly is trying its best to move from open
-to closed systems.  I still remember when Garmin dropped NMEA support
-in its GPSes.
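If you want to try the FIT route with free software only, GPSBabel can do the conversion to GPX on the command line. A small sketch, assuming your GPSBabel build includes the garmin_fit input format and that the watch exposes its recordings as files; the file names are examples:

# Convert one recorded activity from Garmin FIT to GPX
gpsbabel -i garmin_fit -f activity.fit -o gpx -F activity.gpx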
-
-A final idea was to build one's own unit, perhaps by basing it on a
-wearable hardware platform like
-the Flora Geo
-Watch.  Sounds like fun, but I had more money than time to spend on
-the topic, so I suspect it will have to wait for another time.
-
-While I was working on tracking down links, I came across an
-inspiring TED talk by Dave Debronkart about
-being an
-e-patient, and discovered the web site
-Participatory
-Medicine.  If you too want to track your own health and fitness
-without having information about your private life floating around on
-computers owned by others, I recommend checking it out.
+
+Children need to learn how to guard their privacy too.  To help them,
+European Digital Rights (EDRi) created
+a colorful booklet providing information on several privacy related topics,
+and tips on how to protect one's privacy in the digital age.
+
+The 24 page booklet titled Digital Defenders is
+available
+in several languages.  Thanks to the valuable contributions from
+members of Electronic Frontier Norway
+(EFN) and others, it is also available in Norwegian Bokmål.
+If you would like to have it available in your language too,
+contribute
+via Weblate and get in touch.
+
+But a funny, well written and good looking PDF does not have much
+impact, unless it is read by the right audience.  To increase the
+chance of kids reading it, I am currently assisting EFN in getting
+copies printed on paper to distribute on the street and in
+classrooms.  Printing the booklet was made possible thanks to a small set of
+great sponsors.  Thank you very much to each and every one of them!  I
+hope to have the printed booklet ready to hand out on Tuesday, when
+the Norwegian Unix Users Group is
+organizing its yearly
+barbecue for geeks and free software zealots in the Oslo area.  If
+you are nearby, feel free to come by and check out the party and the
+booklet.
+
+If the booklet proves to be a success, it would be great to get
+more sponsors and distribute it to every kid in the country. :)
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -84,7 +59,7 @@ activities, please send Bitcoin donations to my address
@@ -92,36 +67,228 @@ activities, please send Bitcoin donations to my addressDear lazyweb,
-
-I wonder, is there a fitness tracker / health monitor available for
-sale today that respects the user's privacy?  With this I mean a
-watch/bracelet capable of measuring pulse rate and other
-fitness/health related values (and by all means, also the correct time
-and location if possible), which is only provided for
-me to extract/read from the unit with a computer, without a radio beacon
-and Internet connection.  In other words, it does not depend on a cell
-phone app, and does not make the measurements available via other people's
-computers (aka "the cloud").  The collected data should be available
-using only free software.  I'm not interested in depending on some
-non-free software that will leave me high and dry some time in the
-future.  I've been unable to find any such unit.  I would like to buy
-it.  The ones I have seen for sale here in Norway are proud to report
-that they share my health data with strangers (aka "cloud enabled").
-Is there an alternative?  I'm not interested in giving money to people
-requiring me to accept "privacy terms" to allow myself to measure my
-own health.
- -As usual, if you use Bitcoin and want to show your support of my -activities, please send Bitcoin donations to my address +
+av Thomas Sødring (OsloMet) og Petter Reinholdtsen (foreningen +NUUG)
+ +Nikita Noark 5-kjerne er et fri programvareprosjekt som tar i bruk +Arkivverkets spesifikasjonen for Noark 5 Tjenestegrensesnitt og tilbyr +et maskinlesbart grensesnitt (arkiv-API) til datasystemer som trenger å +arkivere dokumenter og informasjon. I tillegg tilbyr Nikita et +nettleserbasert brukergrensesnitt for brukere av arkivet. Dette +brukergrensesnittet benytter det maskinlesbare grensesnittet. Noark 5 +Tjenestegrensesnitt er en ny måte å tenke arkivering, med fokus på +automatisering og maskinell behandling av arkivmateriale, i stedet for +å fokusere på brukergrensesnitt. En kan tenke på +tjenestegrensesnittet som arkivet uten brukergrensesnitt, der flere +aktører kan koble til ulike brukergrensesnitt, tilpasset ulike +behov.
+ +Historisk sett gjorde Noark standarden en veldig bra jobb med +overgangen fra +papir til digital saksbehandling, men det har kommet til kort på andre +områder. Den teknologiske utviklingen har brakt oss ditt at vi kan og +skal forvente langt mer fra en arkivkjerne enn før, men det offentlig +er ofte konservativ når det gjelder nytenking. For lengst skulle +begreper som samvirke mellom datasystemer, metadata, prosess og +tjenestegrensesnitt (API) vært dominerende når systemer kjøpes +inn. Dessverre er det slik at ikke alle ønsker samvirke mellom +datasystemer velkommen, og det kan være trygt å kjøpe «svarte bokser» +der du slipper å ta stilling til hvordan man skal få flere systemer +til å virke sammen. Men IT-arkitektur er et begrep arkivfolk også +begynner å ta inn over seg.
+ +Slike systemer for å organisere metadata bør ha nettbaserte +tjenestegrensesnitt der brukergrensesnitt er tydelig adskilt fra +bakenforliggende system. Det finnes mange rapporter som snakker om å +bryte ned siloer i forvaltningen og standardiserte tjenestegrensesnitt +er det viktigste virkemiddel mot datasiloer og legger til rette for +økt samvirke mellom systemer. Et standardisert tjenestegrensesnitt er +et viktig middel for å få systemer til å samhandle da det sikrer at +ulike produsenters systemer kan snakke sammen på tvers. Samfunnet +fungerer ikke uten standardisering. Vi har alle samme strømstyrke og +kontakter i veggene og kjører alle på høyre side av veien i Norge. Det er i en slik +sammenheng at prosjektet «Noark 5 Tjenestegrensesnitt» er veldig +viktig. Hvis alle leverandører av arkivsystemer forholdt seg til et +standardisert tjenestegrensesnitt kunne kostnadene for arkivering +reduseres. Tenk deg at du er en kommune som ønsker et fagsystem integrert +med arkivløsningen din. I dag må fagsystemleverandøren vite og +tilpasse seg den spesifikke versjonen og varianten av arkivløsningen +du har. Hvis vi antar at alle leverandører av arkivkjerner har solgt +inn enten SOAP eller REST-grensesnitt til kunder de siste 10 årene og +det kommer endret versjon av grensesnittet innimellom, så gir det +veldig mange forskjellige tjenestegrensesnitt en fagsystemleverandør +må forholde seg til. Med 12 leverandører og kvartalsvise oppdateringer +kan det potensielt bli 96 ulike varianter hvert eneste år. Det sier +seg selv at det blir dyrt. Men det blir faktisk verre. Hvis du senere +ønsker å bytte ut arkivsystemet med et annet så er du avhengig å få +alle integrasjonene dine laget på nytt. Dette kan gjøre at du velger å +forbli hos en dårlig leverandør framfor å skaffe nytt system, fordi +det blir for vanskelig og dyrt å bytte. Dermed etableres det «små» +monopolsituasjoner som er vanskelig å bryte ut av. Dårlige valg i dag +kan ha uante kostander på sikt. I Nikita-prosjektet har vi kun jobbet +opp mot Noark 5 Tjenestegrensesnittet. Det har tatt en god del +ressurser å sette seg inn i spesifikasjonen og ta den i bruk, spesielt +på grunn av uklarheter i spesifikasjonen. Hvis vi måtte gjøre det +samme for alle versjoner og varianter av de forskjellige +tjenestegrensesnittene ville det blitt veldig tidkrevende og +kostbart.
+
+For deg som arkivar er digitalisering og systemer som skal virke
+sammen en del av den nye hverdagen.  Du har kanskje blitt skånet for
+det ved å kjøpe svarte bokser, men du risikerer at du gjør deg selv en
+bjørnetjeneste.  Det kan oppleves som kjedelig å fortelle kolleger at
+du skal sette deg inn i et tjenestegrensesnitt, men dette er faktisk
+veldig spennende.  Tjenestegrensesnittet er på en måte blitt levende og
+det er spesielt et begrep du bør merke deg: OData.  Å trekke inn deler
+av OData-standarden som en måte å filtrere entitetsøk i et arkivsystem
+var et nyttig trekk i prosjektet.  Følgende eksempel er en
+OData-spørring det går an å sende inn til en standardisert
+arkivkjerne:
+ ++.../sakarkiv/journalpost?filter=contains(tittel, 'nabovarsel') ++ +
Spørringen over vil hente en liste av alle dine journalposter der +tittelen til journalposten inneholder ordet 'nabovarsel'. Alle +leverandører som implementerer tjenestegrensesnittet vil måtte tilby +dette. Det betyr at hvis du lærer dette språket for et system, vil det +være gjeldende for alle. Dette er egentlig en ny måte å søke i +arkivdatabasen på og vil være svært nyttig, for eksempel kan søk i +tjenestegrensesnittet antagelig brukes til å hente ut offentlig +postjournal. I arkivverden pleier vi å like teknologier som er +menneskelesbart, da vet vi det er enkelt og nyttig! OData er også +viktig fordi det kan bli en ny måte å svare innsynsforespørsler på i +tråd med offentlighetsloven § 9, der retten til å kreve innsyn i +sammenstilling fra databaser er nedfelt. I dag ser vi +forvaltningsorganer som avviser slike krav fordi det «ikke kan gjøres +med enkle framgangsmåter». Bruken av OData i tjenestegrensesnittet, +sammen med maskinlesbar markeringsformater kan være et viktig bidrag +til å åpne arkivene i tråd med prinsippene om en åpen og transparent +forvaltning.
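Som en liten skisse: slik kan spørringen over sendes med curl mot en kjørende arkivkjerne. Vertsnavn, URL-prefiks og pålogging er antagelser og vil variere fra installasjon til installasjon:

# Hent journalposter med 'nabovarsel' i tittelen (alle detaljer er antagelser)
curl --silent \
  --header "Authorization: Bearer $TOKEN" \
  "https://arkiv.example.com/api/sakarkiv/journalpost?filter=contains(tittel,'nabovarsel')"

Svaret er JSON med en liste av journalposter og tilhørende lenker, slik tjenestegrensesnittet beskriver.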
+ +Standardisering er viktig fordi det kan sikre samvirke. +Men den effekten kommer kun hvis standardiseringen sikrer at alle +forstår standarden på samme måte, dvs. at den er entydig og klar. En +god måte å sikre en entydig og klar spesifikasjon er ved å kreve at +det finnes minst to ulike implementasjoner som følger spesifikasjonen +og som kan snakke sammen, det vil si at de snakker samme språk, slik +IETF krever for alle sine standarder, før spesifikasjonen anses å være +ferdig. Tilbakemelding fra miljøet forteller at både leverandører og +kunder har et avslappet forhold til Noark 5 Tjenestegrensesnitt og det +er så langt kun Evry som har visst offentlig at de har en +implementasjon av tjenestegrensesnittet. Evry, HK Data og Fredrikstad +kommune er igang med et pilotprosjekt på Noark 5 +Tjenestegrensesnitt. For å redusere kostnadene for samvirkende +datasystemer betraktelig, er det veldig viktig at vi kommer i en +situasjon der alle leverandører har sine egne implementasjoner av +tjenestegrensesnittet, og at disse oppfører seg likt og i tråd med det +som er beskrevet i spesifikasjonen.
+ +Det er her fri programvare spiller en viktig rolle. Med en uklar +standard blir det som en polsk riksdag, der ingenting fungerer. Nikita +er en fri programvareimplementasjon av tjenestegrensesnitt og kan +fungere som teknisk referanse slik at leverandører enklere kan se og +forstå hvordan standarden skal tolkes. Vi har i Nikitaprosjektet +erfart å ende opp med vidt forskjellige tolkninger når +prosjektmedlemmene leser spesifikasjonsteksten, en effekt av en uklar +spesifikasjon. Men Nikitaprosjektet har også utviklet et test-program +som sjekker om et tjenestegrensesnitt er i samsvar med standarden, og +prosjektet bruker det hele tiden for å sikre at endringer og +forbedringer fungerer. Egenerklæringsskjemaenes dager kan være talte! +Snart vil du selv kunne teste hver oppdatering av arkivsystemet med en +uavhengig sjekk.
+ +Fri programvare representerer en demokratisering av kunnskap der +tolkning- og innlåsingsmakt flyttes fra leverandør til allmenheten. +Med fri programvare har du en litt annerledes verdikjede, der selve +produktet ikke holdes hemmelig for å tjene penger, slik en gjør med +ufri programvare og skytjenester som ikke bruker fri programvare, men +du kan tjene penger på andre deler av verdikjeden. Med fri programvare +kan samfunnet betale for å videreutvikle nyttig +fellesfunksjonalitet.
+ +Nikita er en fri programvareimplementasjon av tjenestegrensesnittet og +kan fungere som en referanseimplementasjon dersom det er ønskelig. +Alle har lik tilgang til koden og det koster ingenting å ta den i bruk +og utforske det. Nikitaprosjektet ønsker tjenestegrensesnittet +velkommen og stiller veldig gjerne opp i diskusjoner om tolkning av +tjenestegrensesnittet. Nikita er bygget på moderne +programmeringsrammeverk og utviklet i full åpenhet. Men Nikita er ikke +noe du kan kjøpe. Nikita er først og fremst et verktøy for forsking og +utvikling laget for å fremme forskning på arkivfeltet. Systemer som +virker sammen har alltid vært hovedfokus og vil være det fremover. +Det brukes som undervisningsverktøy der studentene ved OsloMet lærer +om administrativt oppsett, saksbehandling, uttrekk og samvirkende +datasystemer. Det brukes også som forskningsobjekt der vi ser på +import av dokumentsamlinger, bruk av blokkjede og andre nyskapende +måter å tenke arkiv på. Det er dog helt greit om andre tar Nikita og +pakker det for å selge det som produkt. Forvaltningsorganer med +sterke drift- og utviklingsmiljøer kan også se på Nikita og utforske +hva som er mulig. Dette kan de gjøre uten å måtte betale for +bruksrettigheter eller tilgang til konsulenter. Men arkivering blir +ikke gratis på grunn av Nikita. Det trengs fortsatt folk med +kompetanse og tid til å ta i bruk Nikita.
+ +Nikita har nylig kommet med en ny utgave, den sjette i rekken. +Systemet er ikke ferdig, mest på grunn av at API-spesifikasjonen for +Noark 5 Tjenestegrensesnitt ikke er ferdig, men allerede i dag kan en +bruke Nikita som arkiv. Vi har laget eksempelsystem for å importere +data fra deponi-XML og slik gjøre eksisterende arkivdata tilgjengelig +via et API. Vi har også laget en testklient som importerer epost inn +i arkivet med vedlegg der epostenes trådinformasjon brukes til å legge +eposttråder i samme arkivmappe, og en annen testklient som henter +epost ut av en arkivmappe på mbox-format slik at en vanlig epostklient +kan brukes til å lese igjennom og svare på epostene i en +arkivmappe. De som vil ta en titt på Nikita kan besøke +https://nikita.oslomet.no og +logge inn med brukernavn «admin@example.com» og passord «password». +Dette gir tilgang til det forenklede brukergrensesnittet som brukes +til undervisning. De som heller vil ta en titt under panseret kan +besøke +https://nikita.oslomet.no/browse.html +og der se hvordan API-et fungerer mer i detalj. Innloggingsdetaljer +her er det samme som for brukergrensesnittet.
+ +Fremover er fokuset på forbedring av spesifikasjonen Noark 5 +Tjenestegrensesnitt. De som skrev tjenestegrensesnittet gjorde et +interessant og framtidsrettet grep, de skilte sak fra arkiv. +Tjenestegrensesnittet består av flere "pakker", der noen er +grunnleggende mens andre bygger på de grunnleggende pakkene. Pakkene +som er beskrevet så langt heter «arkivstruktur», «sakarkiv», +«administrasjon», «loggogsporing» og «moeter» (dessverre +planlagt +fjernet i første utgave). Etter hvert håper vi å utforske +prosses- og metadatabeskrivelser til flere fagområder og bidra til at +tjenestegrensesnittet kan legge til flere pakker som «byggarkiv», +«barnevern», «personal», «barnehage», der arkivfaglig metadata- og +dokumentasjonsbehov er kartlagt og standardisert.
+ +Nikita utvikles av en liten prosjektgruppe, og vi er alltid +interessert å bli flere. Hvis en åpen, fri og standardisert tilnærming +til arkivering høres interessant ut, bli med oss på veien videre. Vi +er tilstede på IRC-kanalen #nikita hos FreeNode (tilgjengelig via +nettleser på +https://webchat.freenode.net?channels=#nikita), +og har en e-postliste nikita-noark@nuug.no hos NUUG (tilgjengelig for +påmelding og arkiv på +https://lists.nuug.no/mailman/listinfo/nikita-noark) +der en kan følge med eller være med oss på den spennende veien videre. +Spesifikasjonen for Noark 5 Tjenestegrensesnitt vedlikeholdes på +github, +https://github.com/arkivverket/noark5-tjenestegrensesnitt-standard/.
+ +Som vanlig, hvis du bruker Bitcoin og ønsker å vise din støtte til +det jeg driver med, setter jeg pris på om du sender Bitcoin-donasjoner +til min adresse 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.
For a while now, I have looked for a sensible way to share images
-with my family using a self hosted solution, as it is unacceptable to
-place images from my personal life under the control of strangers
-working for data hoarders like Google or Dropbox.  The last few days I
-have drafted an approach that might work out, and I would like to
-share it with you.  I would like to publish images on a server under
-my control, and point some Internet connected display units to the
-images I published, using some free and open standard.  As my primary
-language is not limited to ASCII, I need to store metadata using
-UTF-8.  Many years ago, I hoped to find a digital photo frame capable
-of reading an RSS feed with image references (aka using the
-<enclosure> RSS tag), but was unable to find a current supplier
-of such frames.  In the end I gave up that approach.
-
-Some months ago, I discovered that
-XScreensaver is able to
-read images from an RSS feed, and used it to set up a screen saver on
-my home info screen, showing images from the Daily images feed from
-NASA.  This proved to work well.  More recently I discovered that
-Kodi (both using
-OpenELEC and
-LibreELEC) provides the
-Feedreader
-screen saver capable of reading an RSS feed with images and news.  For
-fun, I used it this summer to test Kodi on my parents' TV by hooking up
-a Raspberry PI unit with LibreELEC, and wanted to provide them with a
-screen saver showing selected pictures from my selection.
-
-Armed with motivation and a test photo frame, I set out to generate
-an RSS feed for the Kodi instance.  I adjusted my Freedombox instance, created
-/var/www/html/privatepictures/, wrote a small Perl script to extract
-title and description metadata from the photo files and generate the
-RSS file.  I ended up using Perl instead of python, as the
-libimage-exiftool-perl Debian package seemed to handle the EXIF/XMP
-tags I ended up using, while python3-exif did not.  The relevant EXIF
-tags only support ASCII, so I had to find better alternatives.  XMP
-seems to have the support I need.
- -I am a bit unsure which EXIF/XMP tags to use, as I would like to -use tags that can be easily added/updated using normal free software -photo managing software. I ended up using the tags set using this -exiftool command, as these tags can also be set using digiKam:
- -- --exiftool -headline='The RSS image title' \ - -description='The RSS image description.' \ - -subject+=for-family photo.jpeg -
I initially tried the "-title" and "keyword" tags, but they were -invisible in digiKam, so I changed to "-headline" and "-subject". I -use the keyword/subject 'for-family' to flag that the photo should be -shared with my family. Images with this keyword set are located and -copied into my Freedombox for the RSS generating script to find.
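The same tagging can also drive the selection step. A small sketch of how the flagged photos might be located with exiftool alone; the directory is just an example and the tag names are the ones set above:

# Print the RSS title and description for every photo carrying the for-family keyword
exiftool -quiet -recurse \
  -if '$Subject =~ /for-family/' \
  -printFormat '${FileName}: ${Headline} - ${Description}' \
  ~/Pictures

The output can then be fed to whatever generates the RSS items.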
- -Are there better ways to do this? Get in touch if you have better -suggestions.
+
+Some years ago, in 2016, I
+wrote
+for the first time about the Ring peer to peer messaging system.
+It would provide messaging without any central server coordinating the
+system and without requiring all users to register a phone number or
+own a mobile phone.  Back then, I could not get it to work, and put it
+aside until it had seen more development.  A few days ago I decided to
+give it another try, and am happy to report that this time I am able
+to not only send and receive messages, but also place audio and video
+calls.  But only if UDP is not blocked on your network.
+
+The Ring system changed name earlier this year to
+Jami.  I
+tried doing a web search for 'ring' when I discovered it for the first
+time, and can only applaud this change as it is impossible to find
+something called Ring among the noise of other uses of that word.  Now
+you can search for 'jami', and this client and
+the Jami system show up as the first hit, at
+least on duckduckgo.
+ +Jami will by default encrypt messages as well as audio and video +calls, and try to send them directly between the communicating parties +if possible. If this proves impossible (for example if both ends are +behind NAT), it will use a central SIP TURN server maintained by the +Jami project. Jami can also be a normal SIP client. If the SIP +server is unencrypted, the audio and video calls will also be +unencrypted. This is as far as I know the only case where Jami will +do anything without encryption.
+
+Jami is available for several platforms: Linux, Windows, MacOSX,
+Android, iOS, and Android TV.  It is included in Debian already.  Jami
+also works for those using F-Droid without any Google connections,
+while Signal does not.
+The
+protocol is described in the Ring project wiki.  The system uses a
+distributed hash table (DHT), similar to BitTorrent, running
+over UDP.  On one of the networks I use, I discovered Jami failed to
+work.  I tracked this down to the fact that incoming UDP packets
+going to ports 1-49999 were blocked, and the DHT would pick a random
+port and end up in the low range most of the time.  After talking to
+the developers, I solved this by enabling the dhtproxy in the
+settings, thus using TCP to talk to a central DHT proxy instead of
+peering directly with others.  I've been told the developers are
+working on allowing DHT to use TCP to avoid this problem.  I also ran
+into a problem when trying to talk to the version of Ring included in
+Debian Stable (Stretch).  Apparently the protocol changed between
+beta2 and the current version, making these clients incompatible.
+Hopefully the protocol will not be made incompatible in the
+future.
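If you want to check whether a network does this kind of UDP filtering before blaming Jami, a quick netcat test can help. A rough sketch, assuming the OpenBSD netcat variant, a helper machine outside the network, and a port in the range that was blocked for me; the address below is a documentation example:

# On the machine inside the suspect network, listen on a UDP port
nc -u -l 4222

# From the helper machine outside, send a test datagram to the public address
echo ping | nc -u -w 1 192.0.2.10 4222

If the word ping never shows up on the listener, incoming UDP to that port is being dropped somewhere along the path.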
+
+It is worth noting that while looking at Jami and its features, I
+came across another communication platform I have not tested yet:
+the Tox protocol
+and its family of Tox clients.  It might
+become the topic of a future blog post.
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -197,7 +362,7 @@ activities, please send Bitcoin donations to my address
@@ -205,105 +370,92 @@ activities, please send Bitcoin donations to my address
Last night, I wrote
-a
-recipe to stream a Linux desktop using VLC to an instance of Kodi.
-During the day I received valuable feedback, and thanks to the
-suggestions I have been able to rewrite the recipe into a much simpler
-approach requiring no setup at all.  It is a single script that takes
-care of it all.
-
-This new script uses GStreamer instead of VLC to capture the
-desktop and stream it to Kodi.  This fixed the video quality issue I
-saw initially.  It further removes the need to add an m3u file on the
-Kodi machine, as it instead connects to
-the JSON-RPC API in
-Kodi and simply asks Kodi to play from the stream created using
-GStreamer.  Streaming the desktop to Kodi now becomes trivial.  Copy
-the script below, run it with the DNS name or IP address of the kodi
-server to stream to as the only argument, and watch your screen show
-up on the Kodi screen.  Note, it depends on multicast on the local
-network, so if you need to stream outside the local network, the
-script must be modified.  Also note, I have no idea if audio works, as
-I only care about the picture part.
- -- --#!/bin/sh -# -# Stream the Linux desktop view to Kodi. See -# http://people.skolelinux.org/pere/blog/Streaming_the_Linux_desktop_to_Kodi_using_VLC_and_RTSP.html -# for backgorund information. - -# Make sure the stream is stopped in Kodi and the gstreamer process is -# killed if something go wrong (for example if curl is unable to find the -# kodi server). Do the same when interrupting this script. -kodicmd() { - host="$1" - cmd="$2" - params="$3" - curl --silent --header 'Content-Type: application/json' \ - --data-binary "{ \"id\": 1, \"jsonrpc\": \"2.0\", \"method\": \"$cmd\", \"params\": $params }" \ - "http://$host/jsonrpc" -} -cleanup() { - if [ -n "$kodihost" ] ; then - # Stop the playing when we end - playerid=$(kodicmd "$kodihost" Player.GetActivePlayers "{}" | - jq .result[].playerid) - kodicmd "$kodihost" Player.Stop "{ \"playerid\" : $playerid }" > /dev/null - fi - if [ "$gstpid" ] && kill -0 "$gstpid" >/dev/null 2>&1; then - kill "$gstpid" - fi -} -trap cleanup EXIT INT - -if [ -n "$1" ]; then - kodihost=$1 - shift -else - kodihost=kodi.local -fi - -mcast=239.255.0.1 -mcastport=1234 -mcastttl=1 - -pasrc=$(pactl list | grep -A2 'Source #' | grep 'Name: .*\.monitor$' | \ - cut -d" " -f2|head -1) -gst-launch-1.0 ximagesrc use-damage=0 ! video/x-raw,framerate=30/1 ! \ - videoconvert ! queue2 ! \ - x264enc bitrate=8000 speed-preset=superfast tune=zerolatency qp-min=30 \ - key-int-max=15 bframes=2 ! video/x-h264,profile=high ! queue2 ! \ - mpegtsmux alignment=7 name=mux ! rndbuffersize max=1316 min=1316 ! \ - udpsink host=$mcast port=$mcastport ttl-mc=$mcastttl auto-multicast=1 sync=0 \ - pulsesrc device=$pasrc ! audioconvert ! queue2 ! avenc_aac ! queue2 ! mux. \ - > /dev/null 2>&1 & -gstpid=$! - -# Give stream a second to get going -sleep 1 - -# Ask kodi to start streaming using its JSON-RPC API -kodicmd "$kodihost" Player.Open \ - "{\"item\": { \"file\": \"udp://@$mcast:$mcastport\" } }" > /dev/null - -# wait for gst to end -wait "$gstpid" -
I hope you find the approach useful. I know I do.
- -As usual, if you use Bitcoin and want to show your support of my -activities, please send Bitcoin donations to my address -15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.
+
+The first book I published,
+Free Culture by Lawrence
+Lessig, is still selling a few copies.  Not a lot, but enough to
+have contributed slightly over $500 to the Creative Commons Corporation
+so far.  All the profit is sent there.  Most books are still sold via
+Amazon (83 copies), with Ingram second (49) and Lulu (12) and Machette (7) as
+minor channels.  Buying directly from Lulu brings the largest cut to
+Creative Commons.  The English edition sold 80 copies so far, the
+French 59 copies, and the Norwegian only 8 copies.  Nothing impressive,
+but nice to see the work we put in is still being appreciated.  The
+ebook edition is available for free from
+Github.
+ +Title / language | +Quantity | ||||||
---|---|---|---|---|---|---|---|
2016 jan-jun | +2016 jul-dec | +2017 jan-jun | +2017 jul-dec | +2018 jan-jun | +2018 jul-dec | +2019 jan-may | +|
Culture Libre / French | +3 | +6 | +19 | +11 | +7 | +6 | +7 | +
Fri kultur / Norwegian | +7 | +1 | +0 | +0 | +0 | +0 | +0 | +
Free Culture / English | +14 | +27 | +16 | +9 | +3 | +7 | +3 | +
Total | +24 | +34 | +35 | +20 | +10 | +13 | +10 | +
It is fun to see the French edition being more popular than the +English one.
+ +If you would like to translate and publish the book in your native +language, I would be happy to help make it happen. Please get in +touch.
PS: See
-
A while back, I was asked by a friend how to stream the desktop to
-my projector connected to Kodi.  I sadly had to admit that I had no
-idea, as it was a task I had never tried.  Since then, I have been
-looking for a way to do so, preferably without much extra software to
-install on either side.  Today I found a way that seems to kind of
-work.  Not great, but it is a start.
-
-I had a look at several approaches, for example
-using uPnP
-DLNA as described in 2011, but it required a uPnP server, fuse and
-local storage enough to store the stream locally.  This is not going
-to work well for me, lacking enough free space, and it would be
-impossible for my friend to get working.
-
-Next, it occurred to me that perhaps I could use VLC to create a
-video stream that Kodi could play.  Preferably using
-broadcast/multicast, to avoid having to change any setup on the Kodi
-side when starting such a stream.  Unfortunately, the only recipe I
-could find using multicast used the rtp protocol, and this protocol
-seems not to be supported by Kodi.
-
-On the other hand, the rtsp protocol is working!  Unfortunately I
-have to specify the IP address of the streaming machine in both the
-sending command and the file on the Kodi server.  But it is showing my
-desktop, and thus allows us to have a shared look on the big screen at
-the programs I work on.
-
-I did not spend much time investigating codecs.  I combined the
-rtp and rtsp recipes from
-the
-VLC Streaming HowTo/Command Line Examples, and was able to get
-this working on the desktop/streaming end.
- -- --vlc screen:// --sout \ - '#transcode{vcodec=mp4v,acodec=mpga,vb=800,ab=128}:rtp{dst=projector.local,port=1234,sdp=rtsp://192.168.11.4:8080/test.sdp}' -
I ssh-ed into my Kodi box and created a file like this with the -same IP address:
- -- --echo rtsp://192.168.11.4:8080/test.sdp \ - > /storage/videos/screenstream.m3u -
Note the 192.168.11.4 IP address is my desktop's IP address.  As far
-as I can tell the IP must be hardcoded for this to work.  In other
-words, if someone else's machine is going to do the streaming, you have
-to update screenstream.m3u on the Kodi machine and adjust the vlc
-recipe.  To get started, locate the file in Kodi and select the m3u
-file while the VLC stream is running.  The desktop then shows up on my
-big screen. :)
-
-When using the same technique to stream a video file with audio,
-the audio quality is really bad.  No idea if the problem is packet
-loss or bad parameters for the transcode.  I do not know VLC or Kodi
-well enough to tell.
-
-Update 2018-07-12: Johannes Schauer sent me a few
-suggestions and reminded me about an important step.  The "screen:"
-input source is only available once the vlc-plugin-access-extra
-package is installed on Debian.  Without it, you will see this error
-message: "VLC is unable to open the MRL 'screen://'. Check the log
-for details."  He further found that it is possible to drop some parts
-of the VLC command line to reduce the amount of hardcoded information.
-It is also useful to consider using cvlc to avoid having the VLC
-window in the desktop view.  In sum, this gives us this command line on
-the source end
-
- --cvlc screen:// --sout \ - '#transcode{vcodec=mp4v,acodec=mpga,vb=800,ab=128}:rtp{sdp=rtsp://:8080/}' -
and this on the Kodi end
- -
- --echo rtsp://192.168.11.4:8080/ \ - > /storage/videos/screenstream.m3u -
Still bad image quality, though. But I did discover that streaming -a DVD using dvdsimple:///dev/dvd as the source had excellent video and -audio quality, so I guess the issue is in the input or transcoding -parts, not the rtsp part. I've tried to change the vb and ab -parameters to use more bandwidth, but it did not make a -difference.
- -I further received a suggestion from Einar Haraldseid to try using -gstreamer instead of VLC, and this proved to work great! He also -provided me with the trick to get Kodi to use a multicast stream as -its source. By using this monstrous oneliner, I can stream my desktop -with good video quality in reasonable framerate to the 239.255.0.1 -multicast address on port 1234: - -
- --gst-launch-1.0 ximagesrc use-damage=0 ! video/x-raw,framerate=30/1 ! \ - videoconvert ! queue2 ! \ - x264enc bitrate=8000 speed-preset=superfast tune=zerolatency qp-min=30 \ - key-int-max=15 bframes=2 ! video/x-h264,profile=high ! queue2 ! \ - mpegtsmux alignment=7 name=mux ! rndbuffersize max=1316 min=1316 ! \ - udpsink host=239.255.0.1 port=1234 ttl-mc=1 auto-multicast=1 sync=0 \ - pulsesrc device=$(pactl list | grep -A2 'Source #' | \ - grep 'Name: .*\.monitor$' | cut -d" " -f2|head -1) ! \ - audioconvert ! queue2 ! avenc_aac ! queue2 ! mux. -
and this on the Kodi end
- -
- --echo udp://@239.255.0.1:1234 \ - > /storage/videos/screenstream.m3u -
Note the trick to pick a valid pulseaudio source. It might not -pick the one you need. This approach will of course lead to trouble -if more than one source uses the same multicast port and address. -Note the ttl-mc=1 setting, which limit the multicast packages to the -local network. If the value is increased, your screen will be -broadcasted further, one network "hop" for each increase (read up on -multicast to learn more. :)!
-
-Having cracked how to get Kodi to receive multicast streams, I
-could use this VLC command to stream to the same multicast address.
-The image quality is way better than the rtsp approach, but gstreamer
-seems to be doing a better job.
- -+ +-cvlc screen:// --sout '#transcode{vcodec=mp4v,acodec=mpga,vb=800,ab=128}:rtp{mux=ts,dst=239.255.0.1,port=1234,sdp=sap}' -
Just 15 days ago,
+
I am very happy to see all of this fall into place, for use by +the +Noark 5 Tjenestegrensesnitt implementations.
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -456,7 +492,7 @@ activities, please send Bitcoin donations to my address
@@ -464,112 +500,67 @@ activities, please send Bitcoin donations to my addressFive years ago, -I -measured what the most supported MIME type in Debian was, by -analysing the desktop files in all packages in the archive. Since -then, the DEP-11 AppStream system has been put into production, making -the task a lot easier. This made me want to repeat the measurement, -to see how much things changed. Here are the new numbers, for -unstable only this time: - -
Debian Unstable:
- -- count MIME type - ----- ----------------------- - 56 image/jpeg - 55 image/png - 49 image/tiff - 48 image/gif - 39 image/bmp - 38 text/plain - 37 audio/mpeg - 34 application/ogg - 33 audio/x-flac - 32 audio/x-mp3 - 30 audio/x-wav - 30 audio/x-vorbis+ogg - 29 image/x-portable-pixmap - 27 inode/directory - 27 image/x-portable-bitmap - 27 audio/x-mpeg - 26 application/x-ogg - 25 audio/x-mpegurl - 25 audio/ogg - 24 text/html -- -
The list was created like this using a sid chroot: "cat -/var/lib/apt/lists/*sid*_dep11_Components-amd64.yml.gz| zcat | awk '/^ -- \S+\/\S+$/ {print $2 }' | sort | uniq -c | sort -nr | head -20"
-
-It is interesting to see how image formats have passed text/plain
-as the most announced supported MIME type.  These days, thanks to the
-AppStream system, if you run into a file format you do not know, and
-want to figure out which packages support the format, you can find the
-MIME type of the file using "file --mime <filename>", and then
-look up all packages announcing support for this format in their
-AppStream metadata (XML or .desktop file) using "appstreamcli
-what-provides mimetype <mime-type>".  For example if you, like
-me, want to know which packages support inode/directory, you can get a
-list like this:
- -- --% appstreamcli what-provides mimetype inode/directory | grep Package: | sort -Package: anjuta -Package: audacious -Package: baobab -Package: cervisia -Package: chirp -Package: dolphin -Package: doublecmd-common -Package: easytag -Package: enlightenment -Package: ephoto -Package: filelight -Package: gwenview -Package: k4dirstat -Package: kaffeine -Package: kdesvn -Package: kid3 -Package: kid3-qt -Package: nautilus -Package: nemo -Package: pcmanfm -Package: pcmanfm-qt -Package: qweborf -Package: ranger -Package: sirikali -Package: spacefm -Package: spacefm -Package: vifm -% -
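The two steps can be glued together, going from an unknown file straight to the packages claiming to support it. A small sketch, with the file name as a stand-in for whatever you received:

# Look up packages able to handle a file of unknown type
mime=$(file --brief --mime-type mystery-download)
appstreamcli what-provides mimetype "$mime"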
Using the same method, I can quickly discover that the Sketchup file -format is not yet supported by any package in Debian:
- -- --% appstreamcli what-provides mimetype application/vnd.sketchup.skp -Could not find component providing 'mimetype::application/vnd.sketchup.skp'. -% -
Yesterday I used it to figure out which packages support the STL 3D -format:
- -- --% appstreamcli what-provides mimetype application/sla|grep Package -Package: cura -Package: meshlab -Package: printrun -% -
PS: A new version of Cura was uploaded to Debian yesterday.
+
+A while back a colleague and friend from Debian and the Skolelinux /
+Debian Edu project approached me, asking if I knew someone that might
+be interested in helping out with a technology project he was running
+as a teacher at L'école
+franco-danoise - the Danish-French school and kindergarten.  The
+kids were building robots, rovers.  The story behind it is to build a
+rover for use
+on
+the dark side of the moon, and remote control it.  As travel cost
+was a bit high for the final destination, and they wanted to test the
+concept first, he was looking for volunteers to host a rover for the
+kids to control in a foreign country.  I ended up volunteering as a
+host, and last week the rover arrived.  It took a while to arrive
+after it was
+built and shipped, because of customs confusion.  Luckily we were
+able to fix it quickly with help from my colleagues at work.
+
+This is what it looked like when the rover arrived.  Note the cute
+eyes looking up at me from the wrapping
+ +


Once the robot arrived, we needed to track +down batteries and figure out how to build custom firmware for it with +the appropriate wifi settings. I asked a friend if I could get two +18650 batteries from his pile of Tesla batteries (he had them from the +wrack of a crashed Tesla), so now the rover is running on Tesla +batteries.
+
+Building
+the rover
+firmware proved a bit harder, as the code did not work out of the box
+with the Arduino IDE package in Debian Buster.  I suspect this is
+due to an unsolved
+ license problem
+with arduino blocking Debian from upgrading to the latest version.
+In the end we gave up debugging why the IDE failed to find the
+required libraries, and ended up using the Arduino Makefile from the
+arduino-mk Debian
+package instead.  Unfortunately the camera library is missing from
+the Arduino environment in Debian, so we disabled the camera support
+for the first firmware build, to get something up and running.  With
+this reduced firmware, the robot could be controlled via the
+controller server, driving around and measuring distance using its
+internal acoustic sensor.
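For those curious, the arduino-mk approach boils down to a tiny Makefile next to the sketch and a couple of make targets. This is only a rough sketch; the board tag, serial port and library list are assumptions and must be adjusted to the actual rover hardware:

# Hypothetical arduino-mk build, run in the directory holding the sketch
cat > Makefile <<'EOF'
BOARD_TAG    = uno
MONITOR_PORT = /dev/ttyUSB0
include /usr/share/arduino/Arduino.mk
EOF
make          # compile the firmware
make upload   # flash it to the board over the serial port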
+
+Next, with some help from my friend in Denmark, who checked the
+camera library into the gitlab repository for me to use, we were able
+to build a new and more complete version of the firmware, and the
+robot is now up and running.  This is what the "commander" web page
+looks like after taking a measurement and a snapshot:
+ +
If you want to learn more about this project, you can check out
+The
+Dark Side Challenge Hackaday web pages.
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -578,7 +569,7 @@ activities, please send Bitcoin donations to my address
@@ -586,74 +577,64 @@ activities, please send Bitcoin donations to my addressQuite regularly, I let my Debian Sid/Unstable chroot stay untouch -for a while, and when I need to update it there is not enough free -space on the disk for apt to do a normal 'apt upgrade'. I normally -would resolve the issue by doing 'apt install <somepackages>' to -upgrade only some of the packages in one batch, until the amount of -packages to download fall below the amount of free space available. -Today, I had about 500 packages to upgrade, and after a while I got -tired of trying to install chunks of packages manually. I concluded -that I did not have the spare hours required to complete the task, and -decided to see if I could automate it. I came up with this small -script which I call 'apt-in-chunks':
- --#!/bin/sh -# -# Upgrade packages when the disk is too full to upgrade every -# upgradable package in one lump. Fetching packages to upgrade using -# apt, and then installing using dpkg, to avoid changing the package -# flag for manual/automatic. - -set -e - -ignore() { - if [ "$1" ]; then - grep -v "$1" - else - cat - fi -} + +22nd May 2019+This morning, a new release of +Nikita +Noark 5 core project was +announced +on the project mailing list. The Nikita free software solution is +an implementation of the Norwegian archive standard Noark 5 used by +government offices in Norway. These were the changes in version 0.4 +since version 0.3, see the email link above for links to a demo site:
+ +-for p in $(apt list --upgradable | ignore "$@" |cut -d/ -f1 | grep -v '^Listing...'); do - echo "Upgrading $p" - apt clean - apt install --download-only -y $p - for f in /var/cache/apt/archives/*.deb; do - if [ -e "$f" ]; then - dpkg -i /var/cache/apt/archives/*.deb - break - fi - done -done - - -
+ +The script will extract the list of packages to upgrade, try to -download the packages needed to upgrade one package, install the -downloaded packages using dpkg. The idea is to upgrade packages -without changing the APT mark for the package (ie the one recording of -the package was manually requested or pulled in as a dependency). To -use it, simply run it as root from the command line. If it fail, try -'apt install -f' to clean up the mess and run the script again. This -might happen if the new packages conflict with one of the old -packages. dpkg is unable to remove, while apt can do this.
- -It take one option, a package to ignore in the list of packages to -upgrade. The option to ignore a package is there to be able to skip -the packages that are simply too large to unpack. Today this was -'ghc', but I have run into other large packages causing similar -problems earlier (like TeX).
- -Update 2018-07-08: Thanks to Paul Wise, I am aware of two -alternative ways to handle this. The "unattended-upgrades ---minimal-upgrade-steps" option will try to calculate upgrade sets for -each package to upgrade, and then upgrade them in order, smallest set -first. It might be a better option than my above mentioned script. -Also, "aptutude upgrade" can upgrade single packages, thus avoiding -the need for using "dpkg -i" in the script above.
+- Roll out OData handling to all endpoints where applicable
+- Changed the relation key for "ny-journalpost" to the official one.
+- Better link generation on outgoing links.
+- Tidy up code and make code and approaches more consistent throughout + the codebase
+- Update rels to be in compliance with updated version in the + interface standard
+- Avoid printing links on empty objects as they can't have links
+- Small bug fixes and improvements
+- Start moving generation of outgoing links to @Service layer so access + control can be used when generating links
+- Log exception that was being swallowed so it's traceable
+- Fix name mapping problem
+- Update templated printing so templated should only be printed if it + is set true. Requires more work to roll out across entire + application.
+- Remove Record->DocumentObject as per domain model of n5v4
+- Add ability to delete lists filtered with OData
+- Return NO_CONTENT (204) on delete as per interface standard
+- Introduce support for ConstraintViolationException exception
+- Make Service classes extend NoarkService
+- Make code base respect X-Forwarded-Host, X-Forwarded-Proto and + X-Forwarded-Port
+- Update CorrespondencePart* code to be more in line with Single + Responsibility Principle
+- Make package name follow directory structure
+- Make sure Document number starts at 1, not 0
+- Fix issues discovered by FindBugs
+- Update from Date to ZonedDateTime
+- Fix wrong tablename
+- Introduce Service layer tests
+- Improvements to CorrespondencePart
+- Continued work on Class / Classificationsystem
+- Fix feature where authors were stored as storageLocations
+- Update HQL builder for OData
+- Update OData search capability from webpage
+ +If free and open standardized archiving API sound interesting to +you, please contact us on IRC +(#nikita on +irc.freenode.net) or email +(nikita-noark +mailing list).
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -662,7 +643,7 @@ activities, please send Bitcoin donations to my address
@@ -670,23 +651,50 @@ activities, please send Bitcoin donations to my address- -30th June 2018-So far, at least hydro-electric power, coal power, wind power, -solar power, and wood power are well known. Until a few days ago, I -had never heard of stone power. Then I learn about a quarry in a -mountain in -Bremanger i -Norway, where -the -Bremanger Quarry company is extracting stone and dumping the stone -into a shaft leading to its shipping harbour. This downward movement -in this shaft is used to produce electricity. In short, it is using -falling rocks instead of falling water to produce electricity, and -according to its own statements it is producing more power than it is -using, and selling the surplus electricity to the Norwegian power -grid. I find the concept truly amazing. Is this the worlds only -stone power plant?
+ +20th May 2019+As part of my involvement in the work to +standardise +a REST based API for Noark 5, the Norwegian archiving standard, I +spent some time the last few months to try to register a +MIME type +and PRONOM +code for the SOSI file format. The background is that there is a +set of formats approved for long term storage and archiving in Norway, +and among these formats, SOSI is the only format missing a MIME type +and PRONOM code.
+ +What is SOSI, you might ask? To quote Wikipedia: SOSI is short for +Samordnet Opplegg for Stedfestet Informasjon (literally "Coordinated +Approach for Spatial Information", but more commonly expanded in +English to Systematic Organization of Spatial Information). It is a +text based file format for geo-spatial vector information used in +Norway. Information about the SOSI format can be found in English +from Wikipedia. The +specification is available in Norwegian from +the +Norwegian mapping authority. The SOSI standard, which originated +in the beginning of nineteen eighties, was the inspiration and formed the +basis for the XML based +Geography +Markup Language.
+
+I have so far written
+a pattern matching
+rule for the file(1) unix tool to recognize SOSI files, submitted
+a request to the PRONOM project to have a PRONOM ID assigned to the
+format (reference TNA1555078202S60), and today sent a request to IANA
+to register the "text/vnd.sosi" MIME type for this format (reference
+IANA
+#1143144).  If all goes well, in a few months, anyone implementing
+the Noark 5 Tjenestegrensesnitt API specification should be able to
+use an official MIME type and PRONOM code for SOSI files.  In
+addition, anyone using SOSI files on Linux should be able to
+automatically recognise the format and web sites handing out SOSI
+files can begin providing a more specific MIME type.  So far, SOSI
+files have been handed out from web sites using the
+"application/octet-stream" MIME type, which is just a nice way of
+stating "I do not know".  Soon, we will know. :)
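If you want to experiment with the detection before an official rule lands in the file package, a local magic file is enough. This is only a sketch; the assumption that SOSI files start with the .HODE header record should be verified against your own files:

# Write a local magic rule and test it on a SOSI file (file names are examples)
cat > sosi.magic <<'EOF'
0	string	.HODE	SOSI geo-spatial vector data
!:mime	text/vnd.sosi
EOF
file -m sosi.magic example.sos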
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -695,7 +703,7 @@ activities, please send Bitcoin donations to my address
@@ -703,57 +711,146 @@ activities, please send Bitcoin donations to my address- -26th June 2018-My movie playing setup involve Kodi, -OpenELEC (probably soon to be -replaced with LibreELEC) and an -Infocus IN76 video projector. My projector can be controlled via both -a infrared remote controller, and a RS-232 serial line. The vendor of -my projector, InFocus, had been -sensible enough to document the serial protocol in its user manual, so -it is easily available, and I used it some years ago to write -a -small script to control the projector. For a while now, I longed -for a setup where the projector was controlled by Kodi, for example in -such a way that when the screen saver went on, the projector was -turned off, and when the screen saver exited, the projector was turned -on again.
-
-A few days ago, with very good help from parts of my family, I
-managed to find a Kodi Add-on for controlling an Epson projector, and
-got in touch with its author to see if we could join forces and make an
-Add-on with support for several projectors.  To my pleasure, he was
-positive about the idea, and we set out to add InFocus support to his
-add-on, and make the add-on suitable for the official Kodi add-on
-repository.
- -The Add-on is now working (for me, at least), with a few minor -adjustments. The most important change I do relative to the master -branch in the github repository is embedding the -pyserial module in -the add-on. The long term solution is to make a "script" type -pyserial module for Kodi, that can be pulled in as a dependency in -Kodi. But until that in place, I embed it.
-
-The add-on can be configured to turn on the projector when Kodi
-starts, off when Kodi stops, as well as turn the projector off when the
-screensaver starts and on when the screensaver stops.  It can also be
-told to set the projector source when turning on the projector.
-
If this sounds interesting to you, check out
-the
-project github repository.  Perhaps you can send patches to
-support your projector too?  As soon as we find time to wrap up the
-latest changes, it should be available for easy installation using any
-Kodi instance.
- -For future improvements, I would like to add projector model -detection and the ability to adjust the brightness level of the -projector from within Kodi. We also need to figure out how to handle -the cooling period of the projector. My projector refuses to turn on -for 60 seconds after it was turned off. This is not handled well by -the add-on at the moment.
+
+25th March 2019+As part of my involvement with the
+Nikita
+Noark 5 core project, I have been proposing improvements to the
+API specification created by The
+National Archives of Norway and helped migrate the text from a
+version control system unfriendly binary format (docx) to Markdown in
+git.  Combined with the migration to a public git repository (on
+github), this has made it possible for anyone to suggest improvements
+to the text.
+
+The specification is filled with UML diagrams.  I believe the
+original diagrams were modelled using Sparx Systems Enterprise
+Architect, and exported as EMF files for import into docx.  This
+approach makes it very hard to track changes using a version control
+system.  To improve the situation I have been looking for a good text
+based UML format with associated command line free software tools on
+Linux and Windows, to allow anyone to send in corrections to the UML
+diagrams in the specification.  The tool must be text based to work
+with git, and command line to be able to run it automatically to
+generate the diagram images.  Finally, it must be free software to
+allow anyone, even those that can not accept a non-free software
+license, to contribute.
+
+I did not know much about free software UML modelling tools when I
+started.  I have used dia and inkscape for simple modelling in the
+past, but neither are available on Windows, as far as I could tell.  I
+came across a nice
+list
+of text mode uml tools, and tested out a few of the tools listed
+there.  The PlantUML tool seemed
+most promising.  After verifying that the package
+is available in
+Debian and found its
+Java source under a GPL license on github, I set out to test if it
+could represent the diagrams we needed, ie the ones currently in
+the
+Noark 5 Tjenestegrensesnitt specification.  I am happy to report
+that it could represent them, even though it has a few warts here
+and there.
+
+After a few days of modelling I completed the task this weekend.  A
+temporary link to the complete set of diagrams (original and from
+PlantUML) is available in
+the
+github issue discussing the need for a text based UML format, but
+please note I lack a sensible tool to convert EMF files to PNGs, so
+the "original" rendering is not as good as the original was in the
+published PDF.
+ +Here is an example UML diagram, showing the core classes for +keeping metadata about archived documents:
+ ++@startuml +skinparam classAttributeIconSize 0 + +!include media/uml-class-arkivskaper.iuml +!include media/uml-class-arkiv.iuml +!include media/uml-class-klassifikasjonssystem.iuml +!include media/uml-class-klasse.iuml +!include media/uml-class-arkivdel.iuml +!include media/uml-class-mappe.iuml +!include media/uml-class-merknad.iuml +!include media/uml-class-registrering.iuml +!include media/uml-class-basisregistrering.iuml +!include media/uml-class-dokumentbeskrivelse.iuml +!include media/uml-class-dokumentobjekt.iuml +!include media/uml-class-konvertering.iuml +!include media/uml-datatype-elektronisksignatur.iuml + +Arkivstruktur.Arkivskaper "+arkivskaper 1..*" <-o "+arkiv 0..*" Arkivstruktur.Arkiv +Arkivstruktur.Arkiv o--> "+underarkiv 0..*" Arkivstruktur.Arkiv +Arkivstruktur.Arkiv "+arkiv 1" o--> "+arkivdel 0..*" Arkivstruktur.Arkivdel +Arkivstruktur.Klassifikasjonssystem "+klassifikasjonssystem [0..1]" <--o "+arkivdel 1..*" Arkivstruktur.Arkivdel +Arkivstruktur.Klassifikasjonssystem "+klassifikasjonssystem [0..1]" o--> "+klasse 0..*" Arkivstruktur.Klasse +Arkivstruktur.Arkivdel "+arkivdel 0..1" o--> "+mappe 0..*" Arkivstruktur.Mappe +Arkivstruktur.Arkivdel "+arkivdel 0..1" o--> "+registrering 0..*" Arkivstruktur.Registrering +Arkivstruktur.Klasse "+klasse 0..1" o--> "+mappe 0..*" Arkivstruktur.Mappe +Arkivstruktur.Klasse "+klasse 0..1" o--> "+registrering 0..*" Arkivstruktur.Registrering +Arkivstruktur.Mappe --> "+undermappe 0..*" Arkivstruktur.Mappe +Arkivstruktur.Mappe "+mappe 0..1" o--> "+registrering 0..*" Arkivstruktur.Registrering +Arkivstruktur.Merknad "+merknad 0..*" <--* Arkivstruktur.Mappe +Arkivstruktur.Merknad "+merknad 0..*" <--* Arkivstruktur.Dokumentbeskrivelse +Arkivstruktur.Basisregistrering -|> Arkivstruktur.Registrering +Arkivstruktur.Merknad "+merknad 0..*" <--* Arkivstruktur.Basisregistrering +Arkivstruktur.Registrering "+registrering 1..*" o--> "+dokumentbeskrivelse 0..*" Arkivstruktur.Dokumentbeskrivelse +Arkivstruktur.Dokumentbeskrivelse "+dokumentbeskrivelse 1" o-> "+dokumentobjekt 0..*" Arkivstruktur.Dokumentobjekt +Arkivstruktur.Dokumentobjekt *-> "+konvertering 0..*" Arkivstruktur.Konvertering +Arkivstruktur.ElektroniskSignatur -[hidden]-> Arkivstruktur.Dokumentobjekt +@enduml ++ +The format is quite +compact, with little redundant information. The text expresses +entities and relations, and there is little layout related fluff. One +can reuse content by using include files, allowing for consistent +naming across several diagrams. The include files can be standalone +PlantUML too. Here is the content of +media/uml-class-arkivskaper.iuml:
+ ++@startuml +class Arkivstruktur.Arkivskaper+ +{ + +arkivskaperID : string + +arkivskaperNavn : string + +beskrivelse : string [0..1] +} +@enduml + This is what the complete diagram for the PlantUML notation above +look like:
+ ++ +
A cool feature of PlantUML is that the generated PNG files include
+the entire original source diagram as text.  The source (with include
+statements expanded) can be extracted using for example
+exiftool.  Another cool feature is that parts of the entities
+can be hidden after inclusion.  This allows using include files with
+all attributes listed, even for UML diagrams that should not list any
+attributes.
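For reference, this is roughly how a diagram is rendered and how the embedded source can be recovered on the command line. A small sketch; the file names are examples:

# Render a diagram to PNG (PlantUML derives the output name from the input file)
plantuml -tpng arkivstruktur.puml

# Print the diagram source stored inside an already generated PNG
plantuml -metadata arkivstruktur.png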
+
+The diagram also shows some of the warts.  Sometimes the layout
+engine places text labels on top of each other, and sometimes it places
+the class boxes too close to each other, not leaving room for the
+labels on the relationship arrows.  The former can be worked around by
+placing extra newlines in the labels (ie "\n").  I did not do it here
+to be able to demonstrate the issue.  I have not found a good way
+around the latter, so I normally try to reduce the problem by changing
+from vertical to horizontal links to improve the layout.
+
+All in all, I am quite happy with PlantUML, and very impressed with
+how quickly its lead developer responds to questions.  So far I got an
+answer to my questions within a few hours when I sent an email.  I
+definitely recommend looking at PlantUML if you need to make UML
+diagrams.  Note, PlantUML can draw a lot more than class relations.
+Check out the documentation for a complete list. :)
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -762,7 +859,7 @@ activities, please send Bitcoin donations to my address
@@ -770,71 +867,56 @@ activities, please send Bitcoin donations to my address- -28th April 2018-I VHS-kassettenes -tid var det rett frem å ta vare på et TV-program en ønsket å kunne se -senere, uten å være avhengig av at programmet ble sendt på nytt. -Kanskje ønsket en å se programmet på hytten der det ikke var -TV-signal, eller av andre grunner ha det tilgjengelig for fremtidig -fornøyelse. Dette er blitt vanskeligere med introduksjon av -digital-TV og webstreaming, der opptak til harddisk er utenfor de -flestes kontroll hvis de bruker ufri programvare og bokser kontrollert -av andre. Men for NRK her i Norge, finnes det heldigvis flere fri -programvare-alternativer, som jeg har -skrevet -om -før. -Så lenge kilden for nedlastingen er lovlig lagt ut på nett (hvilket -jeg antar NRK gjør), så er slik lagring til privat bruk også lovlig i -Norge.
- -Sist jeg så på saken, i 2016, nevnte jeg at -youtube-dl ikke kunne -bake undertekster fra NRK inn i videofilene, og at jeg derfor -foretrakk andre alternativer. Nylig oppdaget jeg at dette har endret -seg. Fordelen med youtube-dl er at den er tilgjengelig direkte fra -Linux-distribusjoner som Debian -og Ubuntu, slik at en slipper å -finne ut selv hvordan en skal få dem til å virke.
- -For å laste ned et NRK-innslag med undertekster, og få den norske -underteksten pakket inn i videofilen, så kan følgende kommando -brukes:
- --youtube-dl --write-sub --sub-format ttml \ - --convert-subtitles srt --embed-subs \ - https://tv.nrk.no/serie/ramm-ferdig-gaa/MUHU11000316/27-04-2018 -- -URL-eksemplet er dagens toppsak på tv.nrk.no. Resultatet er en -MP4-fil med filmen og undertekster som kan spilles av med VLC. Merk -at VLC ikke viser frem undertekster før du aktiverer dem. For å gjøre -det, høyreklikk med musa i fremviservinduet, velg menyvalget for -undertekst og så norsk språk. Jeg testet også '--write-auto-sub', -men det kommandolinjeargumentet ser ikke ut til å fungere, så jeg -endte opp med settet med argumentlisten over, som jeg fant i en -feilrapport i youtube-dl-prosjektets samling over feilrapporter.
- -Denne støtten i youtube-dl gjør det svært enkelt å lagre -NRK-innslag, det være seg nyheter, filmer, serier eller dokumentater, -for å ha dem tilgjengelig for fremtidig referanse og bruk, uavhengig -av hvor lenge innslagene ligger tilgjengelig hos NRK. Så får det ikke -hjelpe at NRKs jurister mener at det er -vesensforskjellig -å legge tilgjengelig for nedlasting og for streaming, når det rent -teknisk er samme sak.
- -Programmet youtube-dl støtter også en rekke andre nettsteder, se -prosjektoversikten for -en -komplett liste.
+ +24th March 2019+@@ -849,6 +931,23 @@ komplett liste.Yesterday, a new release of +Nikita +Noark 5 core project was +announced +on the project mailing list. The free software solution is an +implementation of the Norwegian archive standard Noark 5 used by +government offices in Norway. These were the changes in version 0.3 +since version 0.2.1 (from NEWS.md):
+ ++
+ +- Improved ClassificationSystem and Class behaviour.
+- Tidied up known inconsistencies between domain model and hateaos links.
+- Added experimental code for blockchain integration.
+- Make token expiry time configurable at upstart from properties file.
+- Continued work on OData search syntax.
+- Started work on pagination for entities, partly implemented for Saksmappe.
+- Finalise ClassifiedCode Metadata entity.
+- Implement mechanism to check if authentication token is still + valid. This allow the GUI to return a more sensible message to the + user if the token is expired.
+- Reintroduce browse.html page to allow user to browse JSON API using + hateoas links.
+- Fix bug in handling file/mappe sequence number. Year change was + not properly handled.
+- Update application yml files to be in sync with current development.
+- Stop 'converting' everything to PDF using libreoffice. Only + convert the file formats doc, ppt, xls, docx, pptx, xlsx, odt, odp + and ods.
+- Continued code style fixing, making code more readable.
+- Minor bug fixes.
+ +If free and open standardized archiving API sound interesting to +you, please contact us on IRC +(#nikita on +irc.freenode.net) or email +(nikita-noark +mailing list).
+ +As usual, if you use Bitcoin and want to show your support of my +activities, please send Bitcoin donations to my address +15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.
Archive
+
- 2019 +
++ +
- January (4)
+ +- February (3)
+ +- March (3)
+ +- May (2)
+ +- June (5)
+ +- July (1)
+ +- 2018
@@ -1151,7 +1258,9 @@ komplett liste.@@ -864,7 +963,15 @@ komplett liste.
- July (5)
-- August (2)
+- August (3)
+ +- September (3)
+ +- October (5)
+ +- November (2)
+ +- December (4)
- bankid (4)
-- bitcoin (9)
+- betalkontant (8)
+ +- bitcoin (11)
- bootsystem (17)
@@ -1159,31 +1268,31 @@ komplett liste.- chrpath (2)
-- debian (161)
+- debian (168)
- debian edu (158)
- debian-handbook (4)
-- digistan (10)
+- digistan (11)
- dld (17)
-- docbook (25)
+- docbook (26)
- drivstoffpriser (4)
-- english (381)
+- english (407)
- fiksgatami (23)
-- fildeling (13)
+- fildeling (14)
-- freeculture (32)
+- freeculture (34)
- freedombox (9)
-- frikanalen (18)
+- frikanalen (20)
- h264 (20)
@@ -1191,7 +1300,9 @@ komplett liste.- isenkram (16)
-- kart (20)
+- kart (22)
+ +- kodi (4)
- ldap (9)
@@ -1205,21 +1316,23 @@ komplett liste.- mesh network (8)
-- multimedia (41)
+- multimedia (42)
+ +- nice free software (12)
-- nice free software (10)
+- noark5 (16)
-- norsk (299)
+- norsk (306)
-- nuug (190)
+- nuug (196)
-- offentlig innsyn (33)
+- offentlig innsyn (37)
- open311 (2)
-- opphavsrett (71)
+- opphavsrett (73)
-- personvern (107)
+- personvern (109)
- raid (2)
@@ -1229,27 +1342,27 @@ komplett liste.- rfid (3)
-- robot (10)
+- robot (12)
- rss (1)
-- ruter (6)
+- ruter (7)
- scraperwiki (2)
-- sikkerhet (54)
+- sikkerhet (56)
- sitesummary (4)
- skepsis (5)
-- standard (55)
+- standard (65)
- stavekontroll (6)
- stortinget (12)
-- surveillance (55)
+- surveillance (56)
- sysadmin (4)
@@ -1257,13 +1370,13 @@ komplett liste.- valg (9)
-- verkidetfri (11)
+- verkidetfri (15)
-- video (66)
+- video (73)
- vitenskap (4)
-- web (41)
+- web (42)