A few days ago, I wondered if there are any privacy respecting -health monitors and/or fitness trackers available for sale these days. -I would like to buy one, but do not want to share my personal data -with strangers, nor be forced to have a mobile phone to get data out -of the unit. I've received some ideas, and would like to share them -with you. - -One interesting data point was a pointer to a Free Software app for -Android called -Gadgetbridge. -It provide cloudless collection and storing of data from a variety of -trackers. Its -list -of supported devices is a good indicator for units where the -protocol is fairly open, as it is obviously being handled by Free -Software. Other units are reportedly encrypting the collected -information with their own public key, making sure only the vendor -cloud service is able to extract data from the unit. The people -contacting me about it said they were using -Amazfit -Bip and -Xiaomi -Band 3.
- -I also got a suggestion to look at some of the units from Garmin. -I was told their GPS watches can be connected via USB and show up as a -USB storage device with -Garmin -FIT files containing the collected measurements. While -proprietary, FIT files apparently can be read at least by -GPSBabel and the -GpxPod Nextcloud -app. It is unclear to me if they can read step count and heart rate -data. The person I talked to was using a Garmin -Garmin Forerunner -935, which is a fairly expensive unit. I doubt it is worth it for -a unit where the vendor clearly is trying its best to move from open -to closed systems. I still remember when Garmin dropped NMEA support -in its GPSes.
- -A final idea was to build ones own unit, perhaps by basing it on a -wearable hardware platforms like -the Flora Geo -Watch. Sound like fun, but I had more money than time to spend on -the topic, so I suspect it will have to wait for another time.
- -While I was working on tracking down links, I came across an -inspiring TED talk by Dave Debronkart about -being a -e-patient, and discovered the web site -Participatory -Medicine. If you too want to track your own health and fitness -without having information about your private life floating around on -computers owned by others, I recommend checking it out.
+
+As part of my involvement in the Nikita archive API project, I've
+been importing a fairly large lump of emails into a test instance of
+the archive to see how well this would go.  I picked a subset of my
+notmuch email database, all public emails sent to me via
+@lists.debian.org, giving me a set of around 216 000 emails to import.
+In the process, I had a look at the various attachments included in
+these emails to figure out what to do with them, and noticed that one
+of the most common attachment formats does not have an official MIME
+type registered with IANA/IETF.  The output from diff, i.e. the input
+for patch, is on the top 10 list of formats included in these emails.
+At the moment people seem to use either text/x-patch or text/x-diff,
+but neither is officially registered.  It would be better if one
+official MIME type were registered and used everywhere.
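+
+For the curious, a rough count like that can be approximated directly
+from the notmuch database using the JSON output of notmuch show
+together with jq.  This is only a sketch; the search expression and
+the choice to count every MIME part instead of just real attachments
+are simplifications that would need adjusting:
+
notmuch show --format=json to:lists.debian.org | \
  jq -r '.. | objects | select(has("content-type")) | .["content-type"]' | \
  sort | uniq -c | sort -rn | head -20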
+
+To try to get one official MIME type for these files, I've brought
+up the topic on the media-types mailing list.  If you are interested
+in discussing which MIME type should become the official one for
+patch files, or are involved in making software that uses a MIME type
+for patches, perhaps you would like to join the discussion?
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -84,7 +55,7 @@ activities, please send Bitcoin donations to my address
@@ -92,27 +63,70 @@ activities, please send Bitcoin donations to my addressDear lazyweb,
- -I wonder, is there a fitness tracker / health monitor available for -sale today that respect the users privacy? With this I mean a -watch/bracelet capable of measuring pulse rate and other -fitness/health related values (and by all means, also the correct time -and location if possible), which is only provided for -me to extract/read from the unit with computer without a radio beacon -and Internet connection. In other words, it do not depend on a cell -phone app, and do make the measurements available via other peoples -computer (aka "the cloud"). The collected data should be available -using only free software. I'm not interested in depending on some -non-free software that will leave me high and dry some time in the -future. I've been unable to find any such unit. I would like to buy -it. The ones I have seen for sale here in Norway are proud to report -that they share my health data with strangers (aka "cloud enabled"). -Is there an alternative? I'm not interested in giving money to people -requiring me to accept "privacy terms" to allow myself to measure my -own health.
+
+My current home stereo is a patchwork of various pieces I picked up
+at flea markets over the years.  It is amazing what kind of equipment
+shows up there.  I've been wondering for a while if it was possible
+to measure how well this equipment works together, and decided to see
+how far I could get using free software.  After trawling the web I
+came across an article from DIY Audio and Video on Speaker Testing
+and Analysis describing how to test speakers, which lists several
+software options, among them AUDio MEasurement System (AUDMES).  It
+is the only free software system I could find focusing on measuring
+speakers and audio frequency response.  In the process I also found
+an interesting article from NOVO on Understanding Speaker
+Specifications and Frequency Response and an article from ecoustics
+on Understanding Speaker Frequency Response, with a lot of
+information on what to look for and how to interpret the graphs.
+Armed with this knowledge, I set out to measure the state of my
+speakers.
+
+The first hurdle was that AUDMES hadn't seen a commit for 10 years
+and did not build with current compilers and libraries.  I got in
+touch with its author, who was no longer spending time on the program
+but gave me write access to the subversion repository on Sourceforge.
+The end result is that the code now builds on Linux and is capable of
+saving and loading the collected frequency response data in CSV
+format.  The application is quite nice and flexible, and I was able
+to select the input and output audio interfaces independently.  This
+made it possible to use a USB mixer as the input source while sending
+output via my laptop headphone connection, as I lacked the hardware
+and cabling to connect both the speakers and the microphone any other
+way.
+
+Using this setup I could see how a large range of high frequencies
+apparently were not making it out of my speakers.  The picture shows
+the frequency response measurement of one of the speakers.  Note that
+the frequency lines seem to be slightly misaligned compared to the
+CSV output from the program.  I cannot hear several of these high
+frequencies myself, according to measurements from Free Hearing Test
+Software, a freeware system to measure your hearing (I am still
+looking for a free software alternative), so I do not know if they
+are coming out of the speakers at all.  I thus do not quite know how
+to figure out whether the missing frequencies are a problem with the
+microphone, the amplifier or the speakers, but I managed to rule out
+the audio card in my PC by measuring my Bose noise cancelling headset
+using its own microphone.  That setup was able to pick up the high
+frequency tones, so the problem with my stereo had to be in the
+amplifier or the speakers.
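+
+To double check the alignment of the frequency lines, the CSV export
+can be plotted outside AUDMES.  Here is a rough sketch using gnuplot;
+the file name and the assumption that the first column is frequency
+and the second is level are mine, so adjust them to match the actual
+export:
+
gnuplot -persist -e "
  set datafile separator ',';
  set logscale x;
  set xlabel 'Frequency (Hz)'; set ylabel 'Level (dB)';
  plot 'speaker.csv' using 1:2 with lines title 'measured response'"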
+
+Anyway, to try to rule out one factor I ended up picking up a new
+set of speakers at a flea market, and these work a lot better than
+the old speakers, so I guess the microphone and amplifier are OK.  If
+you need to measure your own speakers, check out AUDMES.  If more
+people get involved, perhaps the project could become good enough to
+include in Debian?  And if you know of some other free software to
+measure speaker and amplifier performance, please let me know.  I am
+aware of the freeware option REW, but I want something that can keep
+being developed even after the vendor loses interest.
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -121,7 +135,7 @@ activities, please send Bitcoin donations to my address
@@ -129,66 +143,58 @@ activities, please send Bitcoin donations to my addressFor a while now, I have looked for a sensible way to share images -with my family using a self hosted solution, as it is unacceptable to -place images from my personal life under the control of strangers -working for data hoarders like Google or Dropbox. The last few days I -have drafted an approach that might work out, and I would like to -share it with you. I would like to publish images on a server under -my control, and point some Internet connected display units using some -free and open standard to the images I published. As my primary -language is not limited to ASCII, I need to store metadata using -UTF-8. Many years ago, I hoped to find a digital photo frame capable -of reading a RSS feed with image references (aka using the -<enclosure> RSS tag), but was unable to find a current supplier -of such frames. In the end I gave up that approach.
- -Some months ago, I discovered that -XScreensaver is able to -read images from a RSS feed, and used it to set up a screen saver on -my home info screen, showing images from the Daily images feed from -NASA. This proved to work well. More recently I discovered that -Kodi (both using -OpenELEC and -LibreELEC) provide the -Feedreader -screen saver capable of reading a RSS feed with images and news. For -fun, I used it this summer to test Kodi on my parents TV by hooking up -a Raspberry PI unit with LibreELEC, and wanted to provide them with a -screen saver showing selected pictures from my selection.
- -Armed with motivation and a test photo frame, I set out to generate -a RSS feed for the Kodi instance. I adjusted my Freedombox instance, created -/var/www/html/privatepictures/, wrote a small Perl script to extract -title and description metadata from the photo files and generate the -RSS file. I ended up using Perl instead of python, as the -libimage-exiftool-perl Debian package seemed to handle the EXIF/XMP -tags I ended up using, while python3-exif did not. The relevant EXIF -tags only support ASCII, so I had to find better alternatives. XMP -seem to have the support I need.
- -I am a bit unsure which EXIF/XMP tags to use, as I would like to -use tags that can be easily added/updated using normal free software -photo managing software. I ended up using the tags set using this -exiftool command, as these tags can also be set using digiKam:
- -- --exiftool -headline='The RSS image title' \ - -description='The RSS image description.' \ - -subject+=for-family photo.jpeg -
I initially tried the "-title" and "keyword" tags, but they were -invisible in digiKam, so I changed to "-headline" and "-subject". I -use the keyword/subject 'for-family' to flag that the photo should be -shared with my family. Images with this keyword set are located and -copied into my Freedombox for the RSS generating script to find.
- -Are there better ways to do this? Get in touch if you have better -suggestions.
+
+Bittorrent is, as far as I know, currently the most efficient way to
+distribute content on the Internet.  It is used by all sorts of
+content providers, from national TV stations like NRK and Linux
+distributors like Debian and Ubuntu to, of course, the Internet
+Archive.
+
Almost a month ago +a new +package adding Bittorrent support to VLC became available in +Debian testing and unstable. To test it, simply install it like +this:
+ ++apt install vlc-plugin-bittorrent ++ +
Since the plugin was made available for the first time in Debian,
+several improvements have been made to it.  In version 2.2-4, now
+available in both testing and unstable, a desktop file is provided to
+teach browsers to start VLC when the user clicks on torrent files or
+magnet links.  The last part is thanks to me finally understanding
+what the strange x-scheme-handler style MIME types in desktop files
+are used for.  By adding x-scheme-handler/magnet to the MimeType
+entry in the desktop file, at least the Firefox and Chromium browsers
+will offer to start VLC when a magnet URI is selected on a web page.
+The end result is that now, with the plugin installed in Buster and
+Sid, one can visit any Internet Archive page with movies using a web
+browser and click on the torrent link to start streaming the movie.
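+
+If the browser does not offer VLC, it can help to check which desktop
+file the desktop environment has associated with magnet links and
+torrent files.  A small sketch using the xdg-utils tools, where the
+.torrent file name is just a placeholder:
+
xdg-mime query default x-scheme-handler/magnet
xdg-mime query default application/x-bittorrent
xdg-mime query filetype some-movie.torrent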
+
+Note, there are still some misfeatures in the plugin.  One is the
+fact that it will hang and block VLC from exiting until the torrent
+streaming starts.  Another is the fact that it will pick and play a
+random file in a multi-file torrent, which is not always the video
+file you want.  Combined with the first issue, this can make it a bit
+hard to get the video streaming going.  But when it works, it seems
+to do a good job.
+
+For the Debian packaging, I would love to find a good way to test
+if the plugin works with VLC using autopkgtest.  I tried, but do not
+know enough about the inner workings of VLC to get it working.  For
+now the autopkgtest script only checks if the .so file was
+successfully loaded by VLC.  If you have any suggestions, please
+submit a patch to the Debian bug tracking system.
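+
+A very crude smoke test, and not the actual autopkgtest script, could
+be to ask VLC to list its modules and look for the plugin.  This is
+only a sketch, and assumes the module announces itself under a name
+containing "bittorrent":
+
if ! cvlc --list 2>/dev/null | grep -qi bittorrent; then
    echo "error: VLC does not list the bittorrent access module" >&2
    exit 1
fi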
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -197,7 +203,7 @@ activities, please send Bitcoin donations to my address
@@ -205,96 +211,71 @@ activities, please send Bitcoin donations to my addressLast night, I wrote -a -recipe to stream a Linux desktop using VLC to a instance of Kodi. -During the day I received valuable feedback, and thanks to the -suggestions I have been able to rewrite the recipe into a much simpler -approach requiring no setup at all. It is a single script that take -care of it all.
- -This new script uses GStreamer instead of VLC to capture the -desktop and stream it to Kodi. This fixed the video quality issue I -saw initially. It further removes the need to add a m3u file on the -Kodi machine, as it instead connects to -the JSON-RPC API in -Kodi and simply ask Kodi to play from the stream created using -GStreamer. Streaming the desktop to Kodi now become trivial. Copy -the script below, run it with the DNS name or IP address of the kodi -server to stream to as the only argument, and watch your screen show -up on the Kodi screen. Note, it depend on multicast on the local -network, so if you need to stream outside the local network, the -script must be modified. Also note, I have no idea if audio work, as -I only care about the picture part.
- -- --#!/bin/sh -# -# Stream the Linux desktop view to Kodi. See -# http://people.skolelinux.org/pere/blog/Streaming_the_Linux_desktop_to_Kodi_using_VLC_and_RTSP.html -# for backgorund information. - -# Make sure the stream is stopped in Kodi and the gstreamer process is -# killed if something go wrong (for example if curl is unable to find the -# kodi server). Do the same when interrupting this script. -kodicmd() { - host="$1" - cmd="$2" - params="$3" - curl --silent --header 'Content-Type: application/json' \ - --data-binary "{ \"id\": 1, \"jsonrpc\": \"2.0\", \"method\": \"$cmd\", \"params\": $params }" \ - "http://$host/jsonrpc" -} -cleanup() { - if [ -n "$kodihost" ] ; then - # Stop the playing when we end - playerid=$(kodicmd "$kodihost" Player.GetActivePlayers "{}" | - jq .result[].playerid) - kodicmd "$kodihost" Player.Stop "{ \"playerid\" : $playerid }" > /dev/null - fi - if [ "$gstpid" ] && kill -0 "$gstpid" >/dev/null 2>&1; then - kill "$gstpid" - fi -} -trap cleanup EXIT INT - -if [ -n "$1" ]; then - kodihost=$1 - shift -else - kodihost=kodi.local -fi - -mcast=239.255.0.1 -mcastport=1234 -mcastttl=1 - -pasrc=$(pactl list | grep -A2 'Source #' | grep 'Name: .*\.monitor$' | \ - cut -d" " -f2|head -1) -gst-launch-1.0 ximagesrc use-damage=0 ! video/x-raw,framerate=30/1 ! \ - videoconvert ! queue2 ! \ - x264enc bitrate=8000 speed-preset=superfast tune=zerolatency qp-min=30 \ - key-int-max=15 bframes=2 ! video/x-h264,profile=high ! queue2 ! \ - mpegtsmux alignment=7 name=mux ! rndbuffersize max=1316 min=1316 ! \ - udpsink host=$mcast port=$mcastport ttl-mc=$mcastttl auto-multicast=1 sync=0 \ - pulsesrc device=$pasrc ! audioconvert ! queue2 ! avenc_aac ! queue2 ! mux. \ - > /dev/null 2>&1 & -gstpid=$! - -# Give stream a second to get going -sleep 1 - -# Ask kodi to start streaming using its JSON-RPC API -kodicmd "$kodihost" Player.Open \ - "{\"item\": { \"file\": \"udp://@$mcast:$mcastport\" } }" > /dev/null - -# wait for gst to end -wait "$gstpid" -
I hope you find the approach useful. I know I do.
+ +This morning, the new release of the +Nikita +Noark 5 core project was +announced +on the project mailing list. The free software solution is an +implementation of the Norwegian archive standard Noark 5 used by +government offices in Norway. These were the changes in version 0.2 +since version 0.1.1 (from NEWS.md): + +
-
+
- Fix typos in REL names +
- Tidy up error message reporting +
- Fix issue where we used Integer.valueOf(), not Integer.getInteger() +
- Change some String handling to StringBuffer +
- Fix error reporting +
- Code tidy-up +
- Fix issue using static non-synchronized SimpleDateFormat to avoid + race conditions +
- Fix problem where deserialisers were treating integers as strings +
- Update methods to make them null-safe +
- Fix many issues reported by coverity +
- Improve equals(), compareTo() and hash() in domain model +
- Improvements to the domain model for metadata classes +
- Fix CORS issues when downloading document +
- Implementation of case-handling with registryEntry and document upload +
- Better support in Javascript for OPTIONS +
- Adding concept description of mail integration +
- Improve setting of default values for GET on ny-journalpost +
- Better handling of required values during deserialisation +
- Changed tilknyttetDato (M620) from date to dateTime +
- Corrected some opprettetDato (M600) (de)serialisation errors. +
- Improve parse error reporting. +
- Started on OData search and filtering. +
- Added Contributor Covenant Code of Conduct to project. +
- Moved repository and project from Github to Gitlab. +
- Restructured repository, moved code into src/ and web/. +
- Updated code to use Spring Boot version 2. +
- Added support for OAuth2 authentication. +
- Fixed several bugs discovered by Coverity. +
- Corrected handling of date/datetime fields. +
- Improved error reporting when rejecting during deserialisation.
- Adjusted default values provided for ny-arkivdel, ny-mappe, + ny-saksmappe, ny-journalpost and ny-dokumentbeskrivelse. +
- Several fixes for korrespondansepart*. +
- Updated web GUI:
+
-
+
- Now handle both file upload and download. +
- Uses new OAuth2 authentication for login. +
- Forms now fetches default values from API using GET. +
- Added RFC 822 (email), TIFF and JPEG to list of possible file formats. +
+
The changes and improvements are extensive.  Running diffstat on
+the changes between git tag 0.1.1 and 0.2 shows 1098 files changed,
+108666 insertions(+) and 54066 deletions(-).
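+
+For the record, a summary like that can be reproduced from a clone of
+the repository with something along these lines, assuming the two
+releases are tagged 0.1.1 and 0.2:
+
git diff --stat 0.1.1 0.2 | tail -n 1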
+
+If a free and open standardized archiving API sounds interesting to
+you, please contact us on IRC (#nikita on irc.freenode.net) or email
+(the nikita-noark mailing list).
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -303,7 +284,7 @@ activities, please send Bitcoin donations to my address
@@ -311,143 +292,102 @@ activities, please send Bitcoin donations to my addressPS: See
-
A while back, I was asked by a friend how to stream the desktop to -my projector connected to Kodi. I sadly had to admit that I had no -idea, as it was a task I never had tried. Since then, I have been -looking for a way to do so, preferable without much extra software to -install on either side. Today I found a way that seem to kind of -work. Not great, but it is a start.
- -I had a look at several approaches, for example -using uPnP -DLNA as described in 2011, but it required a uPnP server, fuse and -local storage enough to store the stream locally. This is not going -to work well for me, lacking enough free space, and it would -impossible for my friend to get working.
- -Next, it occurred to me that perhaps I could use VLC to create a -video stream that Kodi could play. Preferably using -broadcast/multicast, to avoid having to change any setup on the Kodi -side when starting such stream. Unfortunately, the only recipe I -could find using multicast used the rtp protocol, and this protocol -seem to not be supported by Kodi.
- -On the other hand, the rtsp protocol is working! Unfortunately I -have to specify the IP address of the streaming machine in both the -sending command and the file on the Kodi server. But it is showing my -desktop, and thus allow us to have a shared look on the big screen at -the programs I work on.
- -I did not spend much time investigating codeces. I combined the -rtp and rtsp recipes from -the -VLC Streaming HowTo/Command Line Examples, and was able to get -this working on the desktop/streaming end.
- -- --vlc screen:// --sout \ - '#transcode{vcodec=mp4v,acodec=mpga,vb=800,ab=128}:rtp{dst=projector.local,port=1234,sdp=rtsp://192.168.11.4:8080/test.sdp}' -
I ssh-ed into my Kodi box and created a file like this with the -same IP address:
- -- --echo rtsp://192.168.11.4:8080/test.sdp \ - > /storage/videos/screenstream.m3u -
Note the 192.168.11.4 IP address is my desktops IP address. As far -as I can tell the IP must be hardcoded for this to work. In other -words, if someone elses machine is going to do the steaming, you have -to update screenstream.m3u on the Kodi machine and adjust the vlc -recipe. To get started, locate the file in Kodi and select the m3u -file while the VLC stream is running. The desktop then show up in my -big screen. :)
- -When using the same technique to stream a video file with audio, -the audio quality is really bad. No idea if the problem is package -loss or bad parameters for the transcode. I do not know VLC nor Kodi -enough to tell.
- -Update 2018-07-12: Johannes Schauer send me a few -succestions and reminded me about an important step. The "screen:" -input source is only available once the vlc-plugin-access-extra -package is installed on Debian. Without it, you will see this error -message: "VLC is unable to open the MRL 'screen://'. Check the log -for details." He further found that it is possible to drop some parts -of the VLC command line to reduce the amount of hardcoded information. -It is also useful to consider using cvlc to avoid having the VLC -window in the desktop view. In sum, this give us this command line on -the source end - -
- --cvlc screen:// --sout \ - '#transcode{vcodec=mp4v,acodec=mpga,vb=800,ab=128}:rtp{sdp=rtsp://:8080/}' -
and this on the Kodi end
- -
- --echo rtsp://192.168.11.4:8080/ \ - > /storage/videos/screenstream.m3u -
Still bad image quality, though. But I did discover that streaming -a DVD using dvdsimple:///dev/dvd as the source had excellent video and -audio quality, so I guess the issue is in the input or transcoding -parts, not the rtsp part. I've tried to change the vb and ab -parameters to use more bandwidth, but it did not make a -difference.
- -I further received a suggestion from Einar Haraldseid to try using -gstreamer instead of VLC, and this proved to work great! He also -provided me with the trick to get Kodi to use a multicast stream as -its source. By using this monstrous oneliner, I can stream my desktop -with good video quality in reasonable framerate to the 239.255.0.1 -multicast address on port 1234: - -
- --gst-launch-1.0 ximagesrc use-damage=0 ! video/x-raw,framerate=30/1 ! \ - videoconvert ! queue2 ! \ - x264enc bitrate=8000 speed-preset=superfast tune=zerolatency qp-min=30 \ - key-int-max=15 bframes=2 ! video/x-h264,profile=high ! queue2 ! \ - mpegtsmux alignment=7 name=mux ! rndbuffersize max=1316 min=1316 ! \ - udpsink host=239.255.0.1 port=1234 ttl-mc=1 auto-multicast=1 sync=0 \ - pulsesrc device=$(pactl list | grep -A2 'Source #' | \ - grep 'Name: .*\.monitor$' | cut -d" " -f2|head -1) ! \ - audioconvert ! queue2 ! avenc_aac ! queue2 ! mux. -
and this on the Kodi end
- -
- --echo udp://@239.255.0.1:1234 \ - > /storage/videos/screenstream.m3u -
Note the trick to pick a valid pulseaudio source. It might not -pick the one you need. This approach will of course lead to trouble -if more than one source uses the same multicast port and address. -Note the ttl-mc=1 setting, which limit the multicast packages to the -local network. If the value is increased, your screen will be -broadcasted further, one network "hop" for each increase (read up on -multicast to learn more. :)!
- -Having cracked how to get Kodi to receive multicast streams, I -could use this VLC command to stream to the same multicast address. -The image quality is way better than the rtsp approach, but gstreamer -seem to be doing a better job.
- -+ +-cvlc screen:// --sout '#transcode{vcodec=mp4v,acodec=mpga,vb=800,ab=128}:rtp{mux=ts,dst=239.255.0.1,port=1234,sdp=sap}' -
I have earlier covered the basics of trusted timestamping using the
+'openssl ts' client.  See the blog posts from
+2014,
+2016
+and
+2017
+for those stories.  But sometimes I want to integrate the
+timestamping in other code, and recently I needed to integrate it
+into Python.  After searching a bit, I found the rfc3161 library,
+which seemed like a good fit, but I soon discovered it only worked
+with python version 2, and I needed something that works with python
+version 3.  Luckily I next came across the rfc3161ng library, a fork
+of the original rfc3161 library.  Not only does it work with python
+3, it has also fixed a few of the bugs in the original library, and
+it has an active maintainer.  I decided to wrap it up and make it
+available in Debian, and a few days ago it entered Debian unstable
+and testing.
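+
+Assuming the binary package follows the usual Debian naming scheme
+for Python 3 modules, installing it should be a matter of:
+
apt install python3-rfc3161ng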
+
+Using the library is fairly straightforward.  The only slightly
+problematic step is fetching the certificates required to verify the
+timestamp.  For some services this is easy, while for others I have
+not yet figured out how to do it.  Here is a small standalone code
+example based on one of the integration tests in the library code:
+ ++#!/usr/bin/python3 + +""" + +Python 3 script demonstrating how to use the rfc3161ng module to +get trusted timestamps. + +The license of this code is the same as the license of the rfc3161ng +library, ie MIT/BSD. + +""" + +import os +import pyasn1.codec.der +import rfc3161ng +import subprocess +import tempfile +import urllib.request + +def store(f, data): + f.write(data) + f.flush() + f.seek(0) + +def fetch(url, f=None): + response = urllib.request.urlopen(url) + data = response.read() + if f: + store(f, data) + return data + +def main(): + with tempfile.NamedTemporaryFile() as cert_f,\ + tempfile.NamedTemporaryFile() as ca_f,\ + tempfile.NamedTemporaryFile() as msg_f,\ + tempfile.NamedTemporaryFile() as tsr_f: + + # First fetch certificates used by service + certificate_data = fetch('https://freetsa.org/files/tsa.crt', cert_f) + ca_data_data = fetch('https://freetsa.org/files/cacert.pem', ca_f) + + # Then timestamp the message + timestamper = \ + rfc3161ng.RemoteTimestamper('http://freetsa.org/tsr', + certificate=certificate_data) + data = b"Python forever!\n" + tsr = timestamper(data=data, return_tsr=True) + + # Finally, convert message and response to something 'openssl ts' can verify + store(msg_f, data) + store(tsr_f, pyasn1.codec.der.encoder.encode(tsr)) + args = ["openssl", "ts", "-verify", + "-data", msg_f.name, + "-in", tsr_f.name, + "-CAfile", ca_f.name, + "-untrusted", cert_f.name] + subprocess.check_call(args) + +if '__main__' == __name__: + main() ++ +
The code fetches the required certificates, stores them as temporary
+files, timestamps a simple message, stores the message and timestamp
+to disk and asks 'openssl ts' to verify the timestamp.  A timestamp
+is around 1.5 kiB in size, and should be fairly easy to store for
+future use.
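+
+The same verification can be repeated later without Python, using
+only the stored files.  A sketch, assuming the message, the timestamp
+response and the two certificates from the example were saved under
+these names:
+
openssl ts -verify -data message.txt -in message.tsr \
  -CAfile cacert.pem -untrusted tsa.crt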
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -456,7 +396,7 @@ activities, please send Bitcoin donations to my address
@@ -464,112 +404,59 @@ activities, please send Bitcoin donations to my addressFive years ago, -I -measured what the most supported MIME type in Debian was, by -analysing the desktop files in all packages in the archive. Since -then, the DEP-11 AppStream system has been put into production, making -the task a lot easier. This made me want to repeat the measurement, -to see how much things changed. Here are the new numbers, for -unstable only this time: - -
Debian Unstable:
- -- count MIME type - ----- ----------------------- - 56 image/jpeg - 55 image/png - 49 image/tiff - 48 image/gif - 39 image/bmp - 38 text/plain - 37 audio/mpeg - 34 application/ogg - 33 audio/x-flac - 32 audio/x-mp3 - 30 audio/x-wav - 30 audio/x-vorbis+ogg - 29 image/x-portable-pixmap - 27 inode/directory - 27 image/x-portable-bitmap - 27 audio/x-mpeg - 26 application/x-ogg - 25 audio/x-mpegurl - 25 audio/ogg - 24 text/html -- -
The list was created like this using a sid chroot: "cat -/var/lib/apt/lists/*sid*_dep11_Components-amd64.yml.gz| zcat | awk '/^ -- \S+\/\S+$/ {print $2 }' | sort | uniq -c | sort -nr | head -20"
- -It is interesting to see how image formats have passed text/plain -as the most announced supported MIME type. These days, thanks to the -AppStream system, if you run into a file format you do not know, and -want to figure out which packages support the format, you can find the -MIME type of the file using "file --mime <filename>", and then -look up all packages announcing support for this format in their -AppStream metadata (XML or .desktop file) using "appstreamcli -what-provides mimetype <mime-type>. For example if you, like -me, want to know which packages support inode/directory, you can get a -list like this:
+
+A few days ago, I rescued a Windows victim over to Debian.  To try
+to rescue the remains, I helped set up automatic sync with Google
+Drive.  I did not find any sensible Debian package handling this
+automatically, so I rebuilt the grive2 source from the Ubuntu UPD8
+PPA to do the task, and added an autostart desktop entry and a small
+shell script to run in the background and do the sync while the user
+is logged in.  Here is a sketch of the setup for future reference.
+
+I first created ~/googledrive, entered the directory and
+ran 'grive -a' to authenticate the machine/user.  Next, I
+created an autostart hook in ~/.config/autostart/grive.desktop
+to start the sync when the user logs in:
--% appstreamcli what-provides mimetype inode/directory | grep Package: | sort -Package: anjuta -Package: audacious -Package: baobab -Package: cervisia -Package: chirp -Package: dolphin -Package: doublecmd-common -Package: easytag -Package: enlightenment -Package: ephoto -Package: filelight -Package: gwenview -Package: k4dirstat -Package: kaffeine -Package: kdesvn -Package: kid3 -Package: kid3-qt -Package: nautilus -Package: nemo -Package: pcmanfm -Package: pcmanfm-qt -Package: qweborf -Package: ranger -Package: sirikali -Package: spacefm -Package: spacefm -Package: vifm -% +[Desktop Entry] +Name=Google drive autosync +Type=Application +Exec=/home/user/bin/grive-sync
Using the same method, I can quickly discover that the Sketchup file -format is not yet supported by any package in Debian:
+Finally, I wrote the ~/bin/grive-sync script to sync +~/googledrive/ with the files in Google Drive.
- --% appstreamcli what-provides mimetype application/vnd.sketchup.skp -Could not find component providing 'mimetype::application/vnd.sketchup.skp'. -% -
Yesterday I used it to figure out which packages support the STL 3D -format:
-
-% appstreamcli what-provides mimetype application/sla|grep Package
-Package: cura
-Package: meshlab
-Package: printrun
-%
+
#!/bin/sh
set -e
cd ~/
cleanup() {
    if [ "$syncpid" ] ; then
        kill $syncpid
    fi
}
trap cleanup EXIT INT QUIT
/usr/lib/grive/grive-sync.sh listen googledrive 2>&1 | sed "s%^%$0:%" &
# Remember the PID of the background listener so cleanup() can stop it.
syncpid=$!
while true; do
    if ! xhost >/dev/null 2>&1 ; then
        echo "no DISPLAY, exiting as the user probably logged out"
        exit 1
    fi
    # Note: the path below assumes the logged in user has UID 1000.
    if [ ! -e /run/user/1000/grive-sync.sh_googledrive ] ; then
        /usr/lib/grive/grive-sync.sh sync googledrive
    fi
    sleep 300
done 2>&1 | sed "s%^%$0:%"
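+
+To activate the setup, the script must be executable, and a first
+sync can be run by hand to check that the 'grive -a' authentication
+worked:
+
chmod +x ~/bin/grive-sync
/usr/lib/grive/grive-sync.sh sync googledrive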
PS: A new version of Cura was uploaded to Debian yesterday.
+Feel free to use the setup if you want.  It can be assumed to be
+GNU GPL v2 licensed (or any later version, at your leisure), but I
+doubt code this simple is even possible to claim copyright on.
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -578,7 +465,7 @@ activities, please send Bitcoin donations to my address
@@ -586,74 +473,152 @@ activities, please send Bitcoin donations to my addressQuite regularly, I let my Debian Sid/Unstable chroot stay untouch -for a while, and when I need to update it there is not enough free -space on the disk for apt to do a normal 'apt upgrade'. I normally -would resolve the issue by doing 'apt install <somepackages>' to -upgrade only some of the packages in one batch, until the amount of -packages to download fall below the amount of free space available. -Today, I had about 500 packages to upgrade, and after a while I got -tired of trying to install chunks of packages manually. I concluded -that I did not have the spare hours required to complete the task, and -decided to see if I could automate it. I came up with this small -script which I call 'apt-in-chunks':
+
+It would come as no surprise to anyone that I am interested in
+bitcoins and virtual currencies.  I've been keeping an eye on virtual
+currencies for many years, and it is part of the reason why, a few
+months ago, I started writing a python library for collecting
+currency exchange rates and trading on virtual currency exchanges.  I
+decided to name the end result valutakrambod, which perhaps can be
+translated as "small currency shop".
+
+The library uses the tornado python library to handle HTTP and
+websocket connections, and provides an asynchronous system for
+connecting to and tracking several services.  The code is available
+from github.
+ +There are two example clients of the library. One is very simple and +list every updated buy/sell price received from the various services. +This code is started by running bin/btc-rates and call the client code +in valutakrambod/client.py. The simple client look like this:-set -e +-#!/bin/sh -# -# Upgrade packages when the disk is too full to upgrade every -# upgradable package in one lump. Fetching packages to upgrade using -# apt, and then installing using dpkg, to avoid changing the package -# flag for manual/automatic. +import functools +import tornado.ioloop +import valutakrambod +class SimpleClient(object): + def __init__(self): + self.services = [] + self.streams = [] + pass + def newdata(self, service, pair, changed): + print("%-15s %s-%s: %8.3f %8.3f" % ( + service.servicename(), + pair[0], + pair[1], + service.rates[pair]['ask'], + service.rates[pair]['bid']) + ) + async def refresh(self, service): + await service.fetchRates(service.wantedpairs) + def run(self): + self.ioloop = tornado.ioloop.IOLoop.current() + self.services = valutakrambod.service.knownServices() + for e in self.services: + service = e() + service.subscribe(self.newdata) + stream = service.websocket() + if stream: + self.streams.append(stream) + else: + # Fetch information from non-streaming services immediately + self.ioloop.call_later(len(self.services), + functools.partial(self.refresh, service)) + # as well as regularly + service.periodicUpdate(60) + for stream in self.streams: + stream.connect() + try: + self.ioloop.start() + except KeyboardInterrupt: + print("Interrupted by keyboard, closing all connections.") + pass + for stream in self.streams: + stream.close() +
The library client loops over all known "public" services,
+initialises each of them, subscribes to any updates from the service,
+checks for and activates websocket streaming if the service provides
+it, and if no streaming is supported, fetches information from the
+service immediately and sets up a periodic update every 60 seconds.
+The output from this client can look like this:
-ignore() { - if [ "$1" ]; then - grep -v "$1" - else - cat - fi -} +-for p in $(apt list --upgradable | ignore "$@" |cut -d/ -f1 | grep -v '^Listing...'); do - echo "Upgrading $p" - apt clean - apt install --download-only -y $p - for f in /var/cache/apt/archives/*.deb; do - if [ -e "$f" ]; then - dpkg -i /var/cache/apt/archives/*.deb - break - fi - done -done ++Bl3p BTC-EUR: 5687.110 5653.690 +Bl3p BTC-EUR: 5687.110 5653.690 +Bl3p BTC-EUR: 5687.110 5653.690 +Hitbtc BTC-USD: 6594.560 6593.690 +Hitbtc BTC-USD: 6594.560 6593.690 +Bl3p BTC-EUR: 5687.110 5653.690 +Hitbtc BTC-USD: 6594.570 6593.690 +Bitstamp EUR-USD: 1.159 1.154 +Hitbtc BTC-USD: 6594.570 6593.690 +Hitbtc BTC-USD: 6594.580 6593.690 +Hitbtc BTC-USD: 6594.580 6593.690 +Hitbtc BTC-USD: 6594.580 6593.690 +Bl3p BTC-EUR: 5687.110 5653.690 +Paymium BTC-EUR: 5680.000 5620.240 +
The exchange order book is tracked in addition to the best buy/sell +price, for those that need to know the details.
+ +The other example client is focusing on providing a curses view +with updated buy/sell prices as soon as they are received from the +services. This code is located in bin/btc-rates-curses and activated +by using the '-c' argument. Without the argument the "curses" output +is printed without using curses, which is useful for debugging. The +curses view look like this:
+ +-+ Name Pair Bid Ask Spr Ftcd Age + BitcoinsNorway BTCEUR 5591.8400 5711.0800 2.1% 16 nan 60 + Bitfinex BTCEUR 5671.0000 5671.2000 0.0% 16 22 59 + Bitmynt BTCEUR 5580.8000 5807.5200 3.9% 16 41 60 + Bitpay BTCEUR 5663.2700 nan nan% 15 nan 60 + Bitstamp BTCEUR 5664.8400 5676.5300 0.2% 0 1 1 + Bl3p BTCEUR 5653.6900 5684.9400 0.5% 0 nan 19 + Coinbase BTCEUR 5600.8200 5714.9000 2.0% 15 nan nan + Kraken BTCEUR 5670.1000 5670.2000 0.0% 14 17 60 + Paymium BTCEUR 5620.0600 5680.0000 1.1% 1 7515 nan + BitcoinsNorway BTCNOK 52898.9700 54034.6100 2.1% 16 nan 60 + Bitmynt BTCNOK 52960.3200 54031.1900 2.0% 16 41 60 + Bitpay BTCNOK 53477.7833 nan nan% 16 nan 60 + Coinbase BTCNOK 52990.3500 54063.0600 2.0% 15 nan nan + MiraiEx BTCNOK 52856.5300 54100.6000 2.3% 16 nan nan + BitcoinsNorway BTCUSD 6495.5300 6631.5400 2.1% 16 nan 60 + Bitfinex BTCUSD 6590.6000 6590.7000 0.0% 16 23 57 + Bitpay BTCUSD 6564.1300 nan nan% 15 nan 60 + Bitstamp BTCUSD 6561.1400 6565.6200 0.1% 0 2 1 + Coinbase BTCUSD 6504.0600 6635.9700 2.0% 14 nan 117 + Gemini BTCUSD 6567.1300 6573.0700 0.1% 16 89 nan + Hitbtc+BTCUSD 6592.6200 6594.2100 0.0% 0 0 0 + Kraken BTCUSD 6565.2000 6570.9000 0.1% 15 17 58 + Exchangerates EURNOK 9.4665 9.4665 0.0% 16 107789 nan + Norgesbank EURNOK 9.4665 9.4665 0.0% 16 107789 nan + Bitstamp EURUSD 1.1537 1.1593 0.5% 4 5 1 + Exchangerates EURUSD 1.1576 1.1576 0.0% 16 107789 nan + BitcoinsNorway LTCEUR 1.0000 49.0000 98.0% 16 nan nan + BitcoinsNorway LTCNOK 492.4800 503.7500 2.2% 16 nan 60 + BitcoinsNorway LTCUSD 1.0221 49.0000 97.9% 15 nan nan + Norgesbank USDNOK 8.1777 8.1777 0.0% 16 107789 nan
The script will extract the list of packages to upgrade, try to -download the packages needed to upgrade one package, install the -downloaded packages using dpkg. The idea is to upgrade packages -without changing the APT mark for the package (ie the one recording of -the package was manually requested or pulled in as a dependency). To -use it, simply run it as root from the command line. If it fail, try -'apt install -f' to clean up the mess and run the script again. This -might happen if the new packages conflict with one of the old -packages. dpkg is unable to remove, while apt can do this.
- -It take one option, a package to ignore in the list of packages to -upgrade. The option to ignore a package is there to be able to skip -the packages that are simply too large to unpack. Today this was -'ghc', but I have run into other large packages causing similar -problems earlier (like TeX).
- -Update 2018-07-08: Thanks to Paul Wise, I am aware of two -alternative ways to handle this. The "unattended-upgrades ---minimal-upgrade-steps" option will try to calculate upgrade sets for -each package to upgrade, and then upgrade them in order, smallest set -first. It might be a better option than my above mentioned script. -Also, "aptutude upgrade" can upgrade single packages, thus avoiding -the need for using "dpkg -i" in the script above.
+The code for this client is too complex for a simple blog post, so +you will have to check out the git repository to figure out how it +work. What I can tell is how the three last numbers on each line +should be interpreted. The first is how many seconds ago information +was received from the service. The second is how long ago, according +to the service, the provided information was updated. The last is an +estimate on how often the buy/sell values change.
+ +If you find this library useful, or would like to improve it, I +would love to hear from you. Note that for some of the services I've +implemented a trading API. It might be the topic of a future blog +post.
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -662,7 +627,7 @@ activities, please send Bitcoin donations to my address
@@ -670,23 +635,38 @@ activities, please send Bitcoin donations to my addressSo far, at least hydro-electric power, coal power, wind power, -solar power, and wood power are well known. Until a few days ago, I -had never heard of stone power. Then I learn about a quarry in a -mountain in -Bremanger i -Norway, where -the -Bremanger Quarry company is extracting stone and dumping the stone -into a shaft leading to its shipping harbour. This downward movement -in this shaft is used to produce electricity. In short, it is using -falling rocks instead of falling water to produce electricity, and -according to its own statements it is producing more power than it is -using, and selling the surplus electricity to the Norwegian power -grid. I find the concept truly amazing. Is this the worlds only -stone power plant?
+ +Back in February, I got curious to see +if +VLC now supported Bittorrent streaming. It did not, despite the +fact that the idea and code to handle such streaming had been floating +around for years. I did however find +a standalone plugin +for VLC to do it, and half a year later I decided to wrap up the +plugin and get it into Debian. I uploaded it to NEW a few days ago, +and am very happy to report that it +entered +Debian a few hours ago, and should be available in Debian/Unstable +tomorrow, and Debian/Testing in a few days.
+ +With the vlc-plugin-bittorrent package installed you should be able +to stream videos using a simple call to
+
+vlc https://archive.org/download/TheGoat/TheGoat_archive.torrent
+
+It can handle magnet links too.  Now if only native vlc had
+bittorrent support.  Then a lot more people would be helping each
+other to share public domain and creative commons movies.  The plugin
+needs some stability work with seeking and picking the right file in
+a torrent with many files, but is already usable.  Please note that
+the plugin does not remove downloaded files when vlc is stopped, so
+it can fill up your disk if you are not careful.  Have fun. :)
+
I would love to get help maintaining this package. Get in touch if +you are interested.
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -695,7 +675,7 @@ activities, please send Bitcoin donations to my address
@@ -703,57 +683,26 @@ activities, please send Bitcoin donations to my addressMy movie playing setup involve Kodi, -OpenELEC (probably soon to be -replaced with LibreELEC) and an -Infocus IN76 video projector. My projector can be controlled via both -a infrared remote controller, and a RS-232 serial line. The vendor of -my projector, InFocus, had been -sensible enough to document the serial protocol in its user manual, so -it is easily available, and I used it some years ago to write -a -small script to control the projector. For a while now, I longed -for a setup where the projector was controlled by Kodi, for example in -such a way that when the screen saver went on, the projector was -turned off, and when the screen saver exited, the projector was turned -on again.
- -A few days ago, with very good help from parts of my family, I -managed to find a Kodi Add-on for controlling a Epson projector, and -got in touch with its author to see if we could join forces and make a -Add-on with support for several projectors. To my pleasure, he was -positive to the idea, and we set out to add InFocus support to his -add-on, and make the add-on suitable for the official Kodi add-on -repository.
- -The Add-on is now working (for me, at least), with a few minor -adjustments. The most important change I do relative to the master -branch in the github repository is embedding the -pyserial module in -the add-on. The long term solution is to make a "script" type -pyserial module for Kodi, that can be pulled in as a dependency in -Kodi. But until that in place, I embed it.
- -The add-on can be configured to turn on the projector when Kodi -starts, off when Kodi stops as well as turn the projector off when the -screensaver start and on when the screesaver stops. It can also be -told to set the projector source when turning on the projector. - -
If this sound interesting to you, check out -the -project github repository. Perhaps you can send patches to -support your projector too? As soon as we find time to wrap up the -latest changes, it should be available for easy installation using any -Kodi instance.
- -For future improvements, I would like to add projector model -detection and the ability to adjust the brightness level of the -projector from within Kodi. We also need to figure out how to handle -the cooling period of the projector. My projector refuses to turn on -for 60 seconds after it was turned off. This is not handled well by -the add-on at the moment.
+
+I continue to explore my Kodi installation, and today I wanted to
+tell it to play a youtube URL I received in a chat, without having to
+enter search terms using the on-screen keyboard.  After searching the
+web for API access to the Youtube plugin and testing a bit, I managed
+to find a recipe that worked.  If you have a kodi instance with its
+API available from http://kodihost/jsonrpc, you can try the following
+to check out a nice cover band.
+ ++ +curl --silent --header 'Content-Type: application/json' \ + --data-binary '{ "id": 1, "jsonrpc": "2.0", "method": "Player.Open", + "params": {"item": { "file": + "plugin://plugin.video.youtube/play/?video_id=LuRGVM9O0qg" } } }' \ + http://projector.local/jsonrpc
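+
+To stop playback again from the command line, the same JSON-RPC API
+can be used.  A sketch reusing the Player.GetActivePlayers and
+Player.Stop methods, where the host name matches the example above
+and jq is only used to pick out the player id:
+
playerid=$(curl --silent --header 'Content-Type: application/json' \
  --data-binary '{ "id": 1, "jsonrpc": "2.0",
                   "method": "Player.GetActivePlayers", "params": {} }' \
  http://projector.local/jsonrpc | jq '.result[].playerid')
curl --silent --header 'Content-Type: application/json' \
  --data-binary "{ \"id\": 1, \"jsonrpc\": \"2.0\", \"method\": \"Player.Stop\",
                   \"params\": { \"playerid\": $playerid } }" \
  http://projector.local/jsonrpc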
I've extended the kodi-stream program to take a video source as its
+first argument.  It can now handle direct video links, youtube links
+and 'desktop' to stream my desktop to Kodi.  It is almost like a
+Chromecast. :)
As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address @@ -762,7 +711,7 @@ activities, please send Bitcoin donations to my address
@@ -770,71 +719,19 @@ activities, please send Bitcoin donations to my addressI VHS-kassettenes -tid var det rett frem å ta vare på et TV-program en ønsket å kunne se -senere, uten å være avhengig av at programmet ble sendt på nytt. -Kanskje ønsket en å se programmet på hytten der det ikke var -TV-signal, eller av andre grunner ha det tilgjengelig for fremtidig -fornøyelse. Dette er blitt vanskeligere med introduksjon av -digital-TV og webstreaming, der opptak til harddisk er utenfor de -flestes kontroll hvis de bruker ufri programvare og bokser kontrollert -av andre. Men for NRK her i Norge, finnes det heldigvis flere fri -programvare-alternativer, som jeg har -skrevet -om -før. -Så lenge kilden for nedlastingen er lovlig lagt ut på nett (hvilket -jeg antar NRK gjør), så er slik lagring til privat bruk også lovlig i -Norge.
- -Sist jeg så på saken, i 2016, nevnte jeg at -youtube-dl ikke kunne -bake undertekster fra NRK inn i videofilene, og at jeg derfor -foretrakk andre alternativer. Nylig oppdaget jeg at dette har endret -seg. Fordelen med youtube-dl er at den er tilgjengelig direkte fra -Linux-distribusjoner som Debian -og Ubuntu, slik at en slipper å -finne ut selv hvordan en skal få dem til å virke.
- -For å laste ned et NRK-innslag med undertekster, og få den norske -underteksten pakket inn i videofilen, så kan følgende kommando -brukes:
- --youtube-dl --write-sub --sub-format ttml \ - --convert-subtitles srt --embed-subs \ - https://tv.nrk.no/serie/ramm-ferdig-gaa/MUHU11000316/27-04-2018 -- -
URL-eksemplet er dagens toppsak på tv.nrk.no. Resultatet er en -MP4-fil med filmen og undertekster som kan spilles av med VLC. Merk -at VLC ikke viser frem undertekster før du aktiverer dem. For å gjøre -det, høyreklikk med musa i fremviservinduet, velg menyvalget for -undertekst og så norsk språk. Jeg testet også '--write-auto-sub', -men det kommandolinjeargumentet ser ikke ut til å fungere, så jeg -endte opp med settet med argumentlisten over, som jeg fant i en -feilrapport i youtube-dl-prosjektets samling over feilrapporter.
- -Denne støtten i youtube-dl gjør det svært enkelt å lagre -NRK-innslag, det være seg nyheter, filmer, serier eller dokumentater, -for å ha dem tilgjengelig for fremtidig referanse og bruk, uavhengig -av hvor lenge innslagene ligger tilgjengelig hos NRK. Så får det ikke -hjelpe at NRKs jurister mener at det er -vesensforskjellig -å legge tilgjengelig for nedlasting og for streaming, når det rent -teknisk er samme sak.
- -Programmet youtube-dl støtter også en rekke andre nettsteder, se -prosjektoversikten for -en -komplett liste.
+
+It might seem obvious that software created using tax money should
+be available for everyone to use and improve.  Free Software
+Foundation Europe recently started a campaign to help get more people
+to understand this, and I just signed the petition on
+Public Money, Public Code to help
+them.  I hope you too will do the same.