Samsung UE50MU6170 review

So I’ve bought a new TV (it’s actually the UE50MU6172 in our country, but that doesn’t matter) after roughly 9 years, upgrading our 32” screen to 50”. And it’s Samsung again. Why? Because I saw it, tried it, and it didn’t dim the screen the way I hate. Little did I know that it skips the dimming only for PC input.

Positives

Let’s start with the good things.

  • It’s big. 50” is enough for me, my family and the room it’s in.
  • The remote is cool and works well. I even managed to set it up to control my Motorola cable TV set-top box. It may not be for everyone, but as I don’t take to new things easily, I like this one a lot.
  • Good price. Roughly 500€ is acceptable for me for this performance.
  • 4K. Not sure when I’ll use it, but why not for that price.
  • I like the menu. Probably not the best possible thing, but way improved over their previous Smart solutions.
  • A fancy thing – it has voice control that somewhat works. It switches between inputs, can open Settings, etc.
  • Picture is fine… often.

But I kinda expect a TV to work, right? And there is one more thing I’d expect. Samsung claims it has a stunning picture and considers “dimming” to be a positive. So let’s get to it.

That stupid dimming!

I bought this one because I believed it does not dim – or that I could switch it off somehow. It did not dim when I tested it. What exactly am I talking about, you ask?

When the scene is dark, the whole backlight goes darker too – supposedly to provide you with blacker blacks. Nowadays, and for many years already, Samsung has preferred darker blacks and does not care that white is not white anymore – it’s mere grey. In normal daylight it often is not readable (when it’s text) or the scene is not discernible anymore (when it’s something happening at night, for instance).

Many people complain about it – just Google “samsung disable dimming” or similar searches. But the situation is complicated, as the dimming can be caused by various ECO features (I’ve got those covered already), how the input is treated, picture mode/style, various picture “enhancements” (mostly off is better than on), etc.

What is striking, however, is that even on official forums various Samsung moderators and officials act like they don’t know what we’re talking about. Now, I’ve got two theories for that – and I believe both are true at the same time.

Sometimes they may know what we mean by “dimming”, but they really don’t know what kind of dimming it is (the reason for it). So they have to investigate and guide you through dozens of settings that sometimes help and sometimes don’t.

Other times, I think they know exactly what we mean, but they also know their engineering screwed up big time and they have to cover for them, as there is no single fool-proof way to disable it.

Or they really are clueless – I don’t know.

Now, there are some hacks for that, but these mostly don’t work for your model, or they are buried in some service menu that voids your warranty or whatnot – and perhaps still may not work for your model.

Instead of guessing what mode (Gaming? PC? Movie?!) I need to use to disable that dimming, there should be an easy-to-understand setting related to “dimming” or “dynamic contrast” or something. For years Samsung has been ignoring annoyed voices on this topic. Sure, my mum will not complain; sure, many people don’t know that it doesn’t have to be this way. They don’t know that the source is OK, that the telly is the problem. But there is an informed minority that complains and is ignored.

Other annoying stuff

This will probably be the TV for the next 10+ years and I hope it will serve well in most cases, but there are more little annoyances I noticed after just a couple of hours of usage.

  • It does not have a 3.5mm audio jack. It hardly has any easily usable “legacy” audio output. I need to use headphones with the TV in some situations – and for this TV I had to buy new Bluetooth ones. Luckily, I learnt about it in advance, but I’d never have expected it.
  • I experimented with the browser (an app called Internet). It seems to be OK, but I don’t know how to toggle full-screen in it. Sites can do it when they are built that way, but I can’t do it myself.
  • Today (the second day we have it) it started to quit from Settings back to Home (the bar at the bottom) immediately after I choose Settings. If I manage to choose/move in the settings quickly (not easy – the UI is not very slow, but it’s not really snappy either), it stays in Settings. But this “auto-quit” is really annoying; it didn’t do that earlier in the day and there is no obvious reason why it started. I plugged in a wireless keyboard with a USB dongle in the meantime, but it does it even after I turn the keyboard off and unplug the USB. Even after turning the TV off and on. I can’t find anything about this issue online.
  • The YouTube app does not support a keyboard – still. I don’t know who produces it (YouTube/Google? Samsung?), but it truly is terrible for searching stuff. Luckily, one can use the Internet app (browser) and have a normal YouTube experience there. With a keyboard on your lap it feels nearly like desktop.

Overall, Samsung promotes features and new things more than it cares about usability or quality. Usability got better since their Blu-ray player I bought 6 years ago, for sure, but it’s not there yet.

Rating? 3/5

It could be more, but I really feel cheated when cheap dynamic contrast actually makes some scenes worse. Hard to see, really. I don’t want anti-features. Otherwise it’s an OK TV and most of the time I’ll probably be happy. But I know there will be brightness transitions where I’ll cringe. And it will be way more often than rarely. And I still don’t know how support will help with the virtually inaccessible Settings menu.

Oh… we still really suck at creating software. And it appears in more and more devices.

EDIT a day later: Unplug the TV for some time. Some settings are lost (inputs), but no big deal – and Settings pops up and stays there as expected.

Should you buy Samsung?

I don’t really know. My old 32” Samsung had a great picture that did not change white to grey every time the scene went mostly black/dark. For years Samsung has offered ever better technologies spoiled by this stupid dynamic contrast solution – which I believe is more or less a software bug.

A friend of mine told me: “Give LG a chance! They have superb webOS, check it out!” I didn’t listen. Now I just have to read on forums that LG has dimming too, but with an option to switch it off. Perhaps LG knows better.

So, before you buy Samsung, check out LG… I guess. Or anything else for that matter.

EDIT: 12 days later

The Settings bug appeared again, and it seems the TV has some issues with random commands even when I completely remove the batteries from the remote (actually, I removed the batteries from all my remotes to double-check).

I wasn’t sure whether it was something with my older Samsung Blu-ray player BD-E6100, but today when I wanted to play a DVD there, it started blinking “Not Available” – which it does when I press a remote button that currently has no function. Later, in the menu, it acted like the down button was being pressed. It was stuck in this “mode” even when I disabled all my remotes. It got better after the first press of some action on the TV remote, but – strangely enough – not on the player’s remote. I could even trigger this loop using the Home and Back buttons on the TV remote.

Eventually I figured out that somehow the TV and the player talk to each other in an infinite cycle. I switched off the Anynet+ (HDMI-CEC) setting on the Blu-ray player and – voila – problem gone. Along with the option to control playback with a single remote (not a problem for me, really).

However, Settings falling back to Home immediately did not disappear, and this time the TV is to blame. Even with all the devices off, it does it. I can reproduce it with the keyboard while the remote is dead, batteries removed. Removing the keyboard’s USB dongle does not fix it, although I don’t know whether the TV goes crazy because of it in the first place – it shouldn’t.

So, to sum it up: I’ve got an older, but still not-to-be-replaced, Blu-ray player from Samsung and a TV from Samsung – and they don’t go well together. Both devices are updated. Even the TV by itself isn’t that reliable when it comes to its operating system. So far I’m not going back to the shop with that monster, as I can work around both issues – but it’s annoying. It shows how complexity makes quality harder. And it seems Samsung is not ready for it.


The Sins of (PC) Gaming Industry

The title wasn’t originally meant to sound like “Sins of a Solar Empire”. Especially because the Sins are a nice example of how games should be made – proving that they can be successful even without anti-piracy protection.

This is not a review of a couple of games. It is a review of ugly things even good games often don’t avoid. I don’t know why, as often it’s completely avoidable. In other cases it requires some effort, I’m aware of that. And yes, I’m talking about PC games. So this is kind of a minority report.

Case 1: Mass Effect 2

I’ve just finished the first Mass Effect after a long time, moving on to ME2 to prep myself for Mass Effect 3, which I finally decided to buy. I hadn’t before because it wasn’t available on Steam; however, as I’ve got Origin after all (Sims 2 for free was enticing), I can now proceed to the final part of the trilogy.

I have hardly anything against the first ME. Sure, it wasn’t perfect and it had some bugs even a casual player could encounter – but it was a solid game. ME2 has many improvements; the game is more varied – more story, more variety, more everything. But then there are these stupid things developers could have avoided.

Unskippable videos are the first sin. This applies to unskippable intro pictures/animations that often remind me of the difference between an original and a pirated DVD (yes, the pirated video plays from second one). Why do they annoy us with this? Over and over again? It’s about as interesting as a cookie warning – though there, stupid legislation is to blame.

A related thing is that sometimes they can be skipped – but the key is totally unexpected (not Escape, but Enter, like in ME1?). Sure, there is a question whether any key should interrupt the video or non-interactive sequence – but at least some should.

When you start a new game of ME2 you enjoy the video and sequences the first time, but not necessarily the second time within a day (bug reasons). Luckily, it keeps running in the background, so I could write this post up in the meantime.

Hints using the original keymap instead of the redefined one. This is a minor sin, but it can be quite confusing. Fixing it requires some effort, but I think it’s minimal and worth it. Otherwise it looks cheap and sloppy – which ME2 overall isn’t.

Inconsistent escape/back compared to ME1. When I used Esc in the galaxy map in ME1, it pulled me up one level. ME2 gets out of the map immediately. To go back, you have to use the button on screen. Small thing. Frustrating.

Console-like controls on PC. I understand this, but it has consequences. Now a single button means “run”, “cover”, “jump across” and even “use”. That’s quite an overload. Often I want to run somewhere and instead I jump across sideways.

Lack of some keybinds. While in ME1 you could customize virtually all actions, in ME2 you can’t select weapons directly. You can only go to the next/previous weapon. Or just use the HUD pause to do so, because in real time it’s frustrating and slow.

I’m quite surprised how playable the game still is in spite of this. I like playing the game.

Case 2: Witcher 2

Again – a well-known game and a good one too. But compared to Witcher 1 it suffers from a couple of striking omissions.

Control customization outside the game: The lack of in-game controls customization is the major one. While I still can customize the keybinding, I have to do so in a separate program while the game is not running. I don’t have the game installed now, so I don’t remember whether (or how much) this combines with the “unskippable video” sin, but even without it, it takes a lot of time until you nail your favourite binding. Sometimes I wondered whether the suffering was better or worse than no customization at all.

Needless to say, this game was also more consolized than the first one. But that’s a trend we probably can’t fight.

Case 3: Empire: Total War

This is another game that allows control configuration, but it has its own stupid twist that couldn’t have passed the QA guys (or they were not listened to). When you choose a key that is already bound to something, the game says so… and you have to find where, and rebind that other action to some other key.

So typically you probably first rebind a lot of stuff to some unused keys so you can later freely rebind the actions to the keys you really wanted. Do I need to suggest the obvious improvement here? Just unbind the other key! Or exchange the bindings – although this is programmers trying to be unnecessarily smart.

Talking about that latter (suboptimal) option, I now recall Mass Effect 1, which exchanged bindings like this. By some accident it so happened that I had the E key bound both to forward and to something else at the same time. And I couldn’t get rid of that binding! There was no obvious way to disable it, and anytime I tried to replace it with another key, it moved E into the secondary option. Thinking about it now, I haven’t tried to replace the secondary option as well, but the whole idea of it was ridiculous. (A secondary binding is actually neat, not that I use it that much.)

Just override what I set and unbind it from original action.

(Just in case you’re asking why E is forward – know that ESDF is superior. Hands down.)

And other cases

Other serious sins are examples of lousily and cheaply localized games with terrible translations and no option to switch both sound and text to the original (mostly English). This also often affects patching. Typically, original patches break something in the translated version, and translated patches are not available.

I just hope many of my grudges are not relevant anymore. I have to admit I originally started this post in 2012, and I’m generally not buying 50+ bucks brand-new titles anymore.

How should it look?

I liked Witcher 1, for instance. Even though I wasn’t an RPG guy – I mostly preferred first-person shooters or real-time strategies then – I really liked the game. Even though these “over-the-shoulder” games seem clumsy compared to the FPS genre, the first Witcher drew me in and kept me there for a long time. It was a fine PC game.

When Epic pretty much failed for the first time with Unreal Tournament 3 (not with its Unreal Engine, though), it tried to redeem itself with UT3 Black Edition. The original UT3 had great reviews but many negative user reviews. UT3 Black was a bit too little, too late, but it was a nice thing to do, and they at least showed they cared about the UT brand after all. BTW: Epic is now making a new Unreal Tournament which will be available for free. I’m curious how that plays out, but it’s interesting for sure.

I’ve already mentioned Sins of a Solar Empire. It was a successful title even though it didn’t have DRM. The guys who made the game said it simply (not an exact quotation though): “We’re making a game for people who buy it. Our customers don’t want DRM, so we don’t put it in.” This was a fresh perspective in a world where DRM systems go so far that they intentionally harm your computer. For many years I ignored Ubisoft games because of their DRM, even though I wanted to buy a couple of their games.

There are also other nice examples in the gaming industry, examples where you see that games are a true passion for someone – GOG, or Good Old Games. Originally these guys prepared really old games from DOS times, like Doom or Descent, so that they work on modern systems with a current OS. And GOG too has a fair, DRM-free approach.

With this I swerved from smaller (and some bigger) annoyances to a topic much more serious. Talking about GOG, they have a nice video about their optional client GOG Galaxy that pretty much sums it all up.

But why not talk about DRM? It’s perhaps the biggest sin of the industry. Spending millions on something we don’t want. Sure, I played cracked games when I was younger (and with little to no money). I’m not exactly proud of it. Now I’m sure I’ve paid back many times over. But I choose where my money goes. And goodwill pays back.

Pre-Christmas technology review 2017

It’s been two years since my last technology overview and I feel the need for another one. Especially after otherwise pretty weak blog-year.

Notebook HP Stream 14” with 32GB eMMC (“Snow White”) (verdict: 2/5)

I got this for my son to teach him some HTML or similar entry-level computer stuff. He loves doing it. However, even the nearly bare pre-installed Windows is not able to update itself (the Anniversary Update, I guess) with over 7GB of free disk. And there is no way to free up any more. At least not without some hacking. There is just a single local user (admin), because any other user takes an additional gigabyte or more, especially with a Microsoft account, OneDrive, etc. There are only a couple of apps; I’ve got Visual Studio Code (175 MB) and K-Lite Codec Pack (111 MB).

That’s it. Virtually no data except some pretty tiny HTML files we work on and some pictures (a few megs). Anything else is beyond my control. I tried compressing the OS. I wanted to switch to Linux, but I’ve read eMMCs are not well supported there (in general). The disk is also incredibly slow – don’t let the fact that it’s a flash drive fool you. It’s not an SSD. Simply said, it’s something slow connected over a super-slow interface.

I believe there are devices where it’s acceptable, but please, let’s make it official: a notebook with eMMC is just a stupid idea. Especially with 32 gigs and Windows on top of it – that should be considered a crime. Or a scam at least.

Without the problems of combining Windows with eMMC, I’d take the machine as OK for the price. It’s light and thin, looks good, runs long on battery (5-8 hours, no problem), with a bright enough display. But it simply does not update anymore. Shame.

Next-day update: It didn’t boot the next time and kept ending in a BSoD. Recovery didn’t work anymore, although I’d never touched the recovery partition that was there. A Windows bootable USB didn’t help (you need to upgrade from your booted system, and a clean install didn’t work because there wasn’t free room anymore). Ubuntu 17.10 installed but booted to a black screen – however, after enabling the Compatibility Support Module (CSM) in BIOS (F10 to get there), it booted fine.

Bluetooth speaker Bose SoundLink Mini II (5/5)

I wanted a speaker for a party and I wanted it on short notice (not that I needed it, I just wanted it). I was surprised how many options there are for a Bluetooth speaker, but Bose’s was available and I’d seen it a couple of weeks before – and heard it too. Rather pricey at just under 200 Eur, but I took the risk.

I’m not sure how much the speaker is responsible for “catching” the Bluetooth signal, but sometimes I had to keep it pretty close to the device (especially with my HP ProBook). I settled on using a mobile phone. What I love is the two-way control – I can pause or skip a song using the speaker’s buttons. I also like how it talks to me, and it was easy to pair with devices.

And then there is the sound. Perhaps too deep for my liking, but definitely satisfying. I expect a speaker to play nicely when quiet – that is, I hear enough musical detail yet I can talk with people around me. This is that kind of speaker. The second revision also has micro USB on the speaker itself, in case you don’t have the cradle with you (or it gets broken or whatnot), which is nice. Some say it’s not powerful enough, but for me it is more than adequate. No regrets.

Windows package manager Chocolatey (5/5)

I encountered Chocolatey when I was experimenting with Windows unattended installation for VirtualBox. Since then I’ve used it, and it’s one of the first things I install on fresh computers. Then I just check my notes about setting up Windows (warning: it’s a live document for personal use, not a blog post) and copy-paste commands to install what I want. This is what a good operating system should have at its core – and in a simple manner. Sure, you can add/remove things with various PowerShell commands, but the situation there is messy – some features are available only on the Enterprise edition, so it simply does not do what any Linux package manager does. Chocolatey is a nice plaster for this hole in Windows. ’Nuff said.
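The copy-paste workflow is roughly this – the package names below are just examples, not my actual list; any package from the Chocolatey community repository installs the same way (run from an elevated prompt):

```shell
# Typical Chocolatey commands; git, 7zip and vlc are example packages.
choco install -y git 7zip vlc   # -y answers all confirmation prompts
choco upgrade -y all            # later: upgrade everything installed via choco
choco list --local-only         # see what is installed already
cinst -y imagemagick            # cinst is shorthand for choco install
```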

CSS grid layout

Shortly – go and try it. It’s awesome, especially for responsive design combined with media queries. I first heard about it in this video (highly recommended). While playing with it I also used this tutorial. While still just a Candidate Recommendation, it’s widely supported in recent browsers (although there are some problems with the IEs, where it originally came from). And you can make it fall back gracefully, just in case. I’m no expert, but this is a technology not only for experts – and that’s good.

JavaScript framework ExtJS 4.x (3/5)

I encountered this JavaScript/UI framework on my most recent project; it was selected for us a long time ago. It sure is comprehensive and you can achieve a lot with it – that’s why the verdict went all the way up to 3. I simply can’t deny the results one can achieve with it. But for us it was too much. It all starts with an incredibly steep learning curve. And you pay dearly for anything you don’t understand. Talking about paying – it’s commercial, and the license is pretty expensive. It may be good for all-in ExtJS shops, but not for an occasional project where people may change and need to learn it.

The story of our project? We started with Architect, then some “raw JS” devs walked through, the original Sencha workspace became unusable, and it was coded by hand from then on – until I came and saw the resulting mess. During development it produced off-by-1px glitches, like missing right borders of panels, until you ran Sencha CMD (build), which fixes them – but takes a long time. In general, the whole process is pretty heavyweight, and we couldn’t figure out how to build the result into a completely separate directory tree (e.g. target when the Maven exec plugin was used) without breaking some path references.

So you probably can do a lot with it, but it’s all-in or rather not at all. It’s proprietary, and I’d rather invest in some mainstream OSS framework instead. That would be better both for my career and for the future evolution and maintenance of the project.

Note: Current version is 6.5.x.

Fitness band Xiaomi Mi Band 2 (4/5)

I bought this step counter (as that was the main reason I wanted it) just before the previous Christmas, and it worked well for the purposes I wanted it for. It counted steps. How precise is it? I personally think it doesn’t matter. For a couple of days I carried some Garmin device on the same hand, and it counted 12k steps where the Mi Band counted 10k. But this doesn’t really matter. On your average day, 10k is more than 8k – you know which one is better. On days you do something completely different, you can’t compare the number with a day of different routine. But both devices counted actual steps just fine – the difference happens in between the long walks, like while typing on the computer. The Mi Band also clearly differentiates between walking and running, as it indicates more energy output for the same step count. So far so good; that’s what I wanted.

How good is the Mi Band 2 as a “smart band”? I don’t know. I don’t use it to measure my heart rate, as it doesn’t work that well while running anyway. I don’t have it constantly paired with a mobile phone either; only occasionally do I pair it with a phone running the application. Ah, the application – it runs only on newer Androids; my old Samsung Galaxy S3 was out of the question. It also doesn’t run on tablets in general, nor on PC. That just sucks. I wasn’t able to connect the application with my Xiaomi account either. Instructions are unclear or unavailable, or something fails – I haven’t tried since; I simply use the app on the phone to see the history and don’t care about the bigger picture (yet). I saw recommendations to use it with a different app (like Google Fit or so).

Finally – after roughly half a year the device started falling out of the wristband. It happens to many people, it seems. I was quite lucky to always pick the device up before I bought some cheap replacements. These feel similar – rougher for sure, but after a while they don’t bother me at all. I like that they feel firmer than the original, but it’s been just 3 months, so I’ll see whether they last longer or not. For this price, I have no problem replacing them often.

So, why not 5/5? The device was pretty expensive in Slovakia a year ago (50 Eur or so); it got into the normal range later. The application is not available on many devices (not running on tablets, for arbitrary reasons?!), and it didn’t connect easily (= at all, for me) with an online account. The display is good, but not in strong sunshine. Otherwise it does what I want – but let’s face it, I don’t want that much from it. 🙂

Amazon Prime (verdict cancelled)

OK, this was rather a hasty affair. It lasted less than an hour for me. I tried to use my amazon.com account for it – and this is the first gripe. Amazon should seriously help us decide what account we should (or even could) use for the full benefit in a particular country. Prime in Slovakia has been official for only a couple of weeks (if I understand it correctly), but I have no idea whether I should use a COM, co.uk or DE account for it. I tried COM then – the trial is free, shouldn’t hurt.

I tried a link from the registration email to a movie included with the membership. It does not work in my region. So much for confusing the customer. They seriously should employ their AI to offer me available films. There is no chance of understanding all the limitations before you actually start the trial.

Then I tried Amazon Music – as music is by far the most important thing for me. The site asked for billing information – why? Amazon has it already! Funny that their video site didn’t. But I filled it in… and the wait indicator just kept spinning and spinning. Hm. After a while I reloaded and retried – same result. I couldn’t go further without providing the info, and they didn’t accept it (or reject it, for that matter). Terrible first experience. Anyway… as I browsed around, it dawned on me that there is this Music Prime (included) and Music Unlimited – for an additional 8 Eur/month.

Is even my kind of music available on Prime? I don’t need a million songs… a couple of thousand of the right ones would be enough. Let’s check. Our newest hit in the family – Steven Wilson! Nope… Pink Floyd? Nope. Yes? Mike Oldfield? U2?! Unlimited only… I couldn’t find anything I liked on Prime!

I could just as well cancel the trial immediately. And I did. (It will run out after a month, but after that the service will be cancelled.) A shame, as I’m rather a fanboy, especially when it comes to what they do with AWS. But first the Music site must actually work, especially at its literally first step, and then offer some reasonable music for a reasonable price. Total price, not add-on price.

I feel no urge to buy an Echo Dot for Christmas either… at least not this year. The whole landscape moves pretty fast anyway, and I’m curious where it all is next December.

Converting 96kHz 24-bit FLAC to OGG with ffmpeg

Lately my son Robin asked for Peter Gabriel’s song The Tower That Ate People in the car. I like OGGs, although recently that may have become pointless with the MP3 patents expired. But 15+ years ago it was an obvious choice for me, especially because most encoded MP3 files also had clearly cut-off high frequencies and generally lower quality at the same bitrate. Again – not a problem I’ve encountered with newer MP3s. But I stayed true to OGG, and I honestly don’t need anything better than its Q7 level.

The song is on Peter’s OVO album, but the version Robin likes is from the Back to Front show in London. So I browsed to it, played it and – all the songs were skipped. Darn! I knew it must be because of the quality being very high: the digital download, a companion to the Blu-ray Deluxe Book Edition (yeah, I’m a fan), was in 96kHz for both FLAC and OGG. So I had to recode the OGG – or better, the FLAC – to OGG at a normal sample rate (44.1kHz).

FFmpeg to the rescue!

I’d previously transcoded OGGs to MP3s for a little radio that didn’t support OGGs (I never understood why this happens) and I was very satisfied with FFmpeg, because when I can do something from a command line, I prefer that. So today I downloaded a Windows build of FFmpeg and tried to figure out the switches.

After some Googling I tried -codec:a libvorbis, and it told me there is no such codec. So I tried ffmpeg -codecs to find out what OGG support there is (if there is any). There was just the built-in vorbis codec, so I tried that one. Then ffmpeg told me that its vorbis encoder is just experimental and I must add the -strict -2 switch to enable it. It worked afterwards, but the warning was strange, so I investigated further.

The trouble was that the build from the FFmpeg site did not have libvorbis compiled in. Every time you run ffmpeg, it prints the configuration it was compiled with, and mine didn’t show --enable-libvorbis in the output. It was by accident that I found out I’ve also got ffmpeg on my PATH – which was strange, considering I didn’t put the downloaded version there. It was part of ImageMagick, which I was pretty sure was installed using Chocolatey (most recommended!); I don’t even remember why. But now it came in handy because, behold, this one had libvorbis in it!
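Checking a build for libvorbis is just a matter of grepping that banner. A small sketch – the banner string below is simulated for illustration; on a real system you’d pipe the actual ffmpeg output through grep as shown in the last comment:

```shell
# Simulated fragment of ffmpeg's banner; a real build prints its actual
# configure flags every time it runs.
banner="configuration: --enable-gpl --enable-libvorbis --enable-libx264"

case "$banner" in
  *--enable-libvorbis*) echo "libvorbis available" ;;
  *)                    echo "no libvorbis in this build" ;;
esac
# prints: libvorbis available

# On a real system:
#   ffmpeg -version | grep -o -- --enable-libvorbis
```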

If you have Chocolatey already, just cinst -y imagemagick and then start a new console to find ffmpeg on your PATH. Or do it the hard way.

Those damn spaces!

I use spaces in filenames; replacing them with underscores or something does not make much sense – not in the 21st century, I believe. I respect bash (and prefer it on Windows as well, as delivered by Git), and I consider myself more or less a power user (more “less” than “more”, I guess). I understand that a long time ago text was the thing and objects were not. But all this white-space escaping is sometimes killing me. Just look at all the effort that went into escaping white-space – IFS, quoting, -print0, etc.

Actually, using the NUL character (-print0) as a separator seems most logical, but obviously it’s difficult to put it into plain text. Then again, plain text is awkward for representing anything anyway (except the actual text). I believe some richer environment where lists are true lists is the logical thing to have. I’m not necessarily hinting at PowerShell – not sure they have it right – but they tried to be progressive, for sure.
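Here is the NUL separator in practice – a tiny sketch with throwaway files created in a temp directory just for the demo:

```shell
# Create two files whose names contain spaces.
d=$(mktemp -d)
touch "$d/one song.flac" "$d/another song.flac"

# -print0 terminates each name with NUL; xargs -0 splits on NUL only,
# so each spaced name arrives as exactly one argument.
find "$d" -name '*.flac' -print0 | xargs -0 -n1 echo got:

rm -r "$d"   # clean up the demo directory
```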

When I quote the name containing spaces on the input, it’s a single argument (let’s say $1). But when I used ffmpeg -i "$1"… in the script, the program complained that the first word of the filename is not a valid name. I had encountered this problem many times before, passing arguments from a command line to a script and from there to other commands. Today I learned that "${1}" behaved differently from "$1" for me. I’d always used curlies only to separate the name of a variable from potentially colliding surroundings, but the curly form kept $1 as a single parameter even for another executable called from the script. Handy. Not intuitive. And definitely not something you learn in this section, for instance.
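For what it’s worth, in POSIX shells "$1" and "${1}" should expand identically when quoted – the braces only delimit the variable name – so the difference I saw may have had another cause (smart quotes pasted from a web page are a classic one). A quick way to watch the word splitting itself:

```shell
# count_args prints how many arguments it received.
count_args() { echo $#; }

f="The Tower That Ate People.flac"
count_args $f      # unquoted: split on spaces into 5 words -> 5
count_args "$f"    # quoted: stays one argument -> 1
count_args "${f}"  # braces change nothing here; still one argument -> 1
```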

If this were all more “object-oriented” (in the broader meaning), it would be a filename – a String or even a File object – from the start all the way to where it is used. Spaces would not matter.

Sample rate and unexpected “video” stream

Because the source FLAC file had a sampling rate of 96kHz – and I suspected this was the main reason the car audio system didn’t play it – I wanted to resample the audio to the more traditional CD quality. That’s what the option -ar 44100 does. Because OGG seems to have a single sample format, I didn’t have to care about bringing 24 bits down to 16.

But I was surprised that my OGG didn’t play in foobar2000 and loading it actually created two entries in the playlist. I checked the output of the command more carefully and noticed it had also converted a JPEG image embedded in the FLAC to a “video” stream. Not interested, thank you, said I – and that’s what the -vn (no video) switch does.

And the script is…

Add a setting for the output OGG quality and -y to overwrite the output (I experimented repeatedly; you may not want that, of course) and you get a script like this:

#!/bin/sh

ffmpeg.exe -i "${1}" -ar 44100 -vn -codec:a libvorbis -qscale:a 7 -y "${1%flac}ogg"

It only encodes one file. The last thing I wanted was to handle multiple input arguments with a for loop, although I guess I could have used shift too. Anyway, the command is easy:
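For the record, the multi-argument variant is a one-liner too. A hedged sketch (convert_all is my made-up name, and echo stands in for the real ffmpeg.exe invocation so the loop logic is visible without running the encoder):

```shell
#!/bin/sh
# Sketch: the same conversion extended to many arguments.
# "$@" expands each argument as its own properly quoted word,
# so names with spaces survive intact.
convert_all() {
    for f in "$@"; do
        echo ffmpeg.exe -i "$f" -ar 44100 -vn -codec:a libvorbis -qscale:a 7 -y "${f%flac}ogg"
    done
}

convert_all "My Song.flac" "Another Tune.flac"
```

Drop the echo and you have the batch version; the find -exec approach below just avoids even that.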

find . -name \*.flac -exec ./anything-to-ogg-44k1-q7.sh {} \;

I guess ffmpeg doesn’t care about the input format as long as it recognizes it – hence the “anything”. Of course, ffmpeg can do much more; I just wanted to show one recipe, that’s all.

IntelliJ IDEA and yFiles diagrams

I have a long history with IntelliJ IDEA and overall I love it. Sure, there are tons of minor (and also more than minor) issues, but it’s just like a marriage – you have to find a partner whose flaws you can tolerate. IntelliJ IDEA is such a partner for me.

But there is a thing I really don’t understand at all – for years I hoped JetBrains would reconsider, but it seems to be a lost cause. I believe their diagramming is just plain terrible. And the reason is not how the diagrams look – even though, while some UML diagrams are nice, most dependency diagrams generated from any bigger POM file are virtually unusable.

The worst part of the whole experience is mouse control over the diagram:

  • The mouse wheel scrolls vertically, but terribly slowly. Zoom would be much better – use Ctrl+wheel. But even if you zoom out as much as possible, you often don’t see anything, because the canvas is much bigger than the diagram and the diagram gets lost somewhere on it. Tip: try to find it near the middle, using the scrollbars for orientation.
  • If you ever tried to drag the viewport (a.k.a. the hand tool or something) you found out that none of the intuitive ways work. Even worse, they stand in your way. Use Ctrl+left mouse button, BTW.
  • If you try the middle mouse button on its own you’ll experience the weirdest thing – after you drag with the middle button, nothing happens. But the moment you move the mouse cursor, a magnifier appears on the spot where you ended the drag. No clicking makes it go away. Tip: just press Alt, seriously. Don’t ask. With Alt you can actually move the magnifier around.

If you’ve ever seen a dependency diagram in Eclipse you probably can’t understand the layout of IDEA’s diagrams – or anything else around them, for that matter. In the aforementioned dependency diagrams you either see a cobweb of lines with pixel-sized boxes, or hardly more than a couple of boxes you can actually read at the same time. I tried various layouts and I don’t really like any of them – and it’s not my visual taste, it’s how unhelpful the layout typically is.

There are some things to like, I guess. I like how you can navigate the diagrams with quick search (just start typing). But that’s about it. Why the visualization and layout engine is the way it is, and why the controls are so counter-intuitive, is beyond me.

Of course, if you’re a more regular user of these diagrams you’ll probably get familiar with the controls. Not sure about the layout though. I don’t know how much these flaws are related to the commercial yFiles engine they use and how much to IDEA’s integration of it.

Conclusions? Don’t forget that Alt to get rid of that magnifier. 🙂 I personally believe the diagrams could have been much better and easier to navigate, perhaps even without a commercial engine. That would also enable the community around the free IntelliJ Platform to build something on top of them.

Opinionated JPA with Querydsl book finished!

I’m not sure I’ve ever had such a long pause in blogging – not that I blog that often, but still. I either didn’t want to blog about how not to do things (the current project I’m working on), or I made various notes for myself in GitHub markdown – or, slowly but surely, worked on my book. And this post is about the book.

The book is complete

I finally finished the first edition (and perhaps the last, but not necessarily) of my first technology book, named Opinionated JPA with Querydsl. I started the book on December 16th, 2015, and planned to finish it sometime during 2016. But September 2017 isn’t that late after all – especially with so little happening around JPA nowadays.

When I started I imagined a book of around 100 pages, but the thing grew past 200 in the end. Sure, I made the font a bit bigger so it’s easy to read on ebook readers even in PDF format, which still seems superior in presentation although less flexible on small readers. And even at this volume I didn’t cover all the things that traditional JPA/ORM books leave out.

Don’t mess with JPA, will ye?

I have to admit that I still don’t know JPA inside out, although I can navigate the specification pretty well when I need to find something. There are features I simply refused to use, but for most of these I know they don’t solve the problems I typically have. If I had to put it into a single point, it would be: better control over the generated SQL.

Now I can hear those ORM purists, and I believe I understand the topic reasonably well. I’ve heard about ORM being a leaky abstraction, heard why it’s bad and when it’s actually good, read many articles on the topic and worked many hours with Java ORM solutions. If you want something extensive, there is always Ted Neward’s The Vietnam of Computer Science, written in 2006 – and hardly anything in it is out of date.

But I don’t care about academic ideas here; ORM is real, it’s used, and I actually like many of its features. What I like least, though, is its effort to hide SQL from us. I like its type conversion compared to the very poor low-level JDBC. I can live with unit-of-work as well, but there are cases when it’s simply not suitable. And then you’re left on your own.

Streaming a long query straight to a file or a socket? Expect out-of-memory errors if you’re querying entities that eventually fill up your persistence context, even though you don’t need them there at all. Even without a persistence context, it simply tries to create the whole list first before you can work with it. No cursor, nothing. Is this really such an unexpected and rare need?

Not compliant, not knowing it

I always firmly believed that if you work with an SQL database you should know SQL. Whatever blanket you put over it, ignorance is hardly ever a good thing. I wrote quite a lot of articles on JPA. I saw first-hand what happens when you consider open-session-in-view a pattern instead of what it really is (an antipattern). I tackled the N+1 problem in the context of pagination and thought about the recurring problem of mapping enums to arbitrary database values. I realized that all the theory about the specification crumbles in practice when you get caught in the crossfire of various bugs in various JPA providers. I tried to modularize a single entity model (persistence unit).

However, I also liked the improvements in JPA 2.1, and ORM still made my life easier in most situations. When I discovered that I could actually join on an arbitrary value – e.g. map a foreign key as a plain value and then use a join with an explicit on clause – I was blown away. That’s when I asked myself: “Why don’t other people try it too? Why do we keep fighting the ins and outs of relation mappings? Why do we rely on convoluted configurations or particular providers to give us lazy to-one mapping?”

And then I decided to write a book about it. There was more to it – I wanted to lump together more of my rogue ideas about JPA/ORM, while still remaining a more or less concerned user of JPA, not a hater. I also wanted to see whether I could pull it off, all the way. I wanted to see what it’s like to self-publish a book on something like Leanpub. I didn’t expect much profit from it though; I realized this is no Perennial Seller, as it’s too technology-related and really niche, destined to go out of date rather soon.

But then, during writing, while I was testing my ideas with both Hibernate and EclipseLink, I found out that Hibernate did not support JOIN on a root entity (e.g. join Dog d on …), only on entity paths (e.g. join person.dog). What the… how could they miss this?! And then it dawned on me… this is not part of the specification. My book more or less stopped for a couple of months, but eventually went on, openly admitting that I’m not JPA compliant anymore. The good thing is that Hibernate eventually joined the club – since 5.1 they support these so-called “ad hoc joins”.

Here we are

Before I end this post, I’d just like to return to the abstractions we talked about previously. Right now I’m reading Patterns of Software by Richard P. Gabriel, written in 1996. We could argue that some of the problems are solved already, but I wouldn’t be so sure. There’s a chapter called Abstraction Descant, and I found that it really speaks to me. Abstractions are important tools in our arsenal, but not everything can be solved by abstraction.

After reading this I realized I care even less whether ORM is a leaky abstraction or not – it should be practical, and how clean or perfect an abstraction it is matters much less. Especially with ORM being quite a big beast. It’s not the low level where abstractions shine best – like data structures, etc. I’m not going to say more; read that part of the book and make up your own mind.

So – I’ve finished the book, hopefully not in vain. Kind of a longer blog post, if you will. If you’re interested but not sure about it, you can grab it for free (and you can eventually pay later if you like it and feel it helped – I’m sure it’s possible :-)).

Bratislava World Usability Day 2016 and government IT

I wrote about sustainability and design takeaways from Bratislava World Usability Day 2016 in my previous post. World Usability Day (WUD) 2016 was organized on November 10th, 2016 in many places around the world. The theme this year was Sustainability, but for us, working with and for the public sector, it was even more attractive thanks to the guests from UK and Estonian government agencies that implement or oversee government services – services for real people, citizens. Services that should serve – just like the state itself should. And that is a very touchy topic here in Slovakia.

Videos from the Bratislava event can be found here; the page is in Slovak, but the videos are easy to find and are in English.

Estonia: pathfinder or e-Narnia?

Risto Hinno came to us from Estonia, the state renowned for its level of e-government. But if you imagined their world as a perfect place with flawless services, you’d be wrong. Risto came to talk about their approach to services and the problems they had to overcome and are still overcoming.

Estonia and Slovakia are both countries of the former Eastern Bloc; Slovakia is a successor of Czechoslovakia, while Estonia is one of the post-Soviet states. Both states are in NATO and the EU and both use the euro, but there are also some important differences. I may not be historically accurate, but in Slovakia we still have plenty of “old guard” people in their posts (like judges) and plenty of old-thinking politicians, many of them former members of the Communist Party, now often wearing the sticker that says “social democrat”. In Estonia most of these were Russians, and they were simply gone after Estonia became independent. That allowed for a deeper change – change that is much needed here in Slovakia but hasn’t happened. Some ask: “Will it ever?”

But back to the services. As Risto put it, what we (citizens) want is value, but what we typically get is bureaucracy. The answer to this problem is to make everything smaller and simpler and really focus on the value.

Problems small and big

But just as with the value-vs-bureaucracy problem, there are opposing forces in play here. Even when the stakeholders agree on delivering maximum value for the money, they often don’t agree on how to do it.

Very often the expectations are big and the budget follows them. Very often we don’t respect the systems our users already work with. And very often we deliver little value for a lot of money afterwards. Or worse, we make the life of our users harder and they simply can’t understand what advantages of the new system we are talking about.

It is very important to understand that we need to deliver value in small chunks. Many times in my career I’ve heard: “…but we can’t deliver this useful system in small pieces!” Really? How do you know you can deliver it on a bigger scale, then? History shows us time after time that megalomaniac plans crumble. And, to make matters worse, they often crumble over many, many years.

Managers often expect that developers can plan their work, while the developers have trouble accounting for all the complexity in advance – often the accidental (that is, “not essential”) complexity. And accidental complexity always grows with a bigger system; there is simply no remedy for that. Analyse as much as you want, you’ll find out something unexpected the minute you start coding. Or when you meet the customer. These truths have been known for decades now, but they still seemingly make no sense to many managers and other key decision makers.

And so far we’ve only talked about the mismatch in beliefs about how to build complex systems. What does it matter whether you want to “build it” or “let it grow”, whether you are forced into a “fixed time, fixed price” contract or can actually work incrementally using whatever agile flavour is currently chic – all of this is not important at all when the true reason to spend the money is… well, to spend the money!

Yes, public money, a.k.a. nobody’s money – who cares? People care, of course – people who are somewhere in the chain. People who decide who should participate and have a piece of that big cake – competent or not, it doesn’t matter. There is always a subcontractor that will do it in the end. Money talks. And value just stands aside. Just as users and their needs do.

It can be scaled down

Of course it can; the question is whether we dare to be accountable and flexible enough to deliver clear value for the money. Value that is easy to see and to evaluate, so we know whether it was worth it or not. In Estonia they are also far from perfect, but they try hard to keep it small and simple (the KISS principle). They limit their evaluation/analysis projects to 50k Euro and implementation projects to 500k.

I saw people laugh at this, but 500k in these countries is a reasonable cost for an 8-10 person team for a year. Yes, you have to mix juniors and seniors, which is pretty normal – and no, you can’t pay for 3 more levels of management above the team. Get out of their way and they will likely deliver more value than a similar team in a typical corporate environment that has to spend 20% of its time on reporting and other overhead (and that’s a low estimate).

If the cost calculation doesn’t work for you, take fewer people and make the project last half a year, not a full one. Nobody will convince me that there is no way to deliver visible value within 500k Euro.

Risto Hinno also mentioned another very interesting thing. They decide how many services – or how much work, if you will – they want implemented at a time. This way they prevent the IT market in Estonia from overheating, because overheating leads to very low quality: companies start hiring everyone and anyone, a lot of code is written by external workers who often don’t care, and everything is done at way too high a pace. These are all recipes for disaster. Things they seem to know in Estonia, but not here in Slovakia.

Problems with services

Risto also talked about the typical problems they faced. They learned the hard way that services must have an owner. He also presented a maturity model of the services. Based on my notes, and not necessarily his exact words, the levels are:

  1. ad hoc services,
  2. simple list of services is managed,
  3. services have their owners,
  4. services are measured (including their value),
  5. service management is a daily routine.

He talked about building measurement into the services. This part of the talk rang a lot of devops/continuous-delivery bells. And he also talked about their visions for the future:

  • Base future services on life events. This makes them more obvious to their consumers, that is, citizens.
  • Aggregated services – many simple services can collaborate to achieve more complex scenarios. Risto actually mentioned some crazy number of services they have, but also noted that many of them are really simple. Still – it’s easier to put together simple blocks than to cut and slice big blocks.
  • Link between public and IT services.

So Estonia seems to have started well and they keep improving. I hope they stay on track, because I loved the ideas presented – and many of them were familiar to me. I just needed to hear that it actually works somewhere. And now it’s time to get to the next level.

Designing the next generation of government services around user needs

That was the title of the presentation by Ciara Green, who came to tell us how they do it in the United Kingdom. She works for GDS, the Government Digital Service, and she talked about the transformation of government services that, to simplify, started around 2010 with quite a short letter (11 pages) by Martha Lane Fox, after she was asked to oversee a review of the state of government services at the time. Sure, 11 pages seems long for a letter, but it was short in a world where you more likely get hundreds of pages of analysis that in the end is not to the point. The letter was.

After this, government services all came under a single domain, gov.uk, and many other good things happened. The UK is way ahead of Slovakia, historically and mentally of course (despite Brexit and all the lies leading to it) – so it doesn’t come as a surprise that they decided to focus on value and also used current agile methodologies.

They knew what happens if you deliver over many years and then surprise your customer or users – invariably not a good surprise. So they started to deliver fast and often, tested a lot, tested with real users including seniors, and focused on UX. Just like Risto, Ciara argued for making things simple. It is very easy to make things complex and long-winded, and we should do the opposite. We should start with needs, real-world needs, and remind ourselves of them often. And we should do less (which reminds me of the powerful “less but better” mantra).

Another interesting point: good services are verbs, bad services are nouns. Of course there are also other components, various registers, but in the end the focus should be on services and on the activities (e.g. life situations) they cover. Sure, the verbs are a bit unusual sometimes. One very important service is called Verify, and it verifies the identity of the user with various partners (banks, Royal Mail, and more), because in the UK there is no central register of citizens. So they can do this without keeping personal data (I don’t know the details), while here in Slovakia we have been building various registers for years and they often add more problems than they solve.

Funnily enough, when she talked about some services and a name was used, it still functioned as a noun in the sentence – quite naturally. So I believe the word class used for the name may not be the most important thing, but using verbs may remind us what we are trying to solve.

Back to Slovak reality

Ciara’s talk was pure sci-fi for us. She works for a government agency where they develop services in an agile way. How crazy is that? Pretty crazy, if you say it in Slovakia. Slovak services are portioned out among many companies, most of them with a political background (not official, of course), and we have spent around 800 million Euro on government IT that looks like a very bad joke. Each ministry takes care of its own sandbox, and if there is some initiative to unify how IT is done, it is executed extremely badly.

For example, there is a central portal for public services that acts as a hub and connects various parties in government IT. However, this “service” is good mostly for its provider, not for its users. The protocols are crazy complicated; if you need to connect to it (whether you want to or are forced to, which is more likely) you have to conform to some strict testing plan, etc. There is no way to do it in an agile fashion; the portal only separates you from the service you want to talk to. It adds another barrier between you and the other party, not only technically but also organizationally.

It is said that one minister told a young woman working at the ministry, horrified at how the state works, that she should not be naive, that sometimes things are as they are and we have to be realistic. He reportedly pointed at government IT and the bad companies who suck money out of it. Now, this is all a matter of speculation, but the words could have been said. The tragedy is twofold.

First: The companies do what they are allowed to do. It is not that bad companies do whatever they want; they do it with connections to officials of the government and various bureaus. As crazy as it sounds, there are stories of someone who worked for some company now working for the state and managing projects his previous employer delivers. Stories like this are uncovered on virtually a daily basis now.

Second: Even if it were true and the bad companies did whatever they wanted… then the state totally failed to do its basic job. It actually failed in the first case too, but here it would show a very weak state, not the state our officials depict to us.

Final words

While the Slovak reality is pretty bleak, it was very refreshing to see that it can be done, and how it’s done somewhere else. It’s nice to see that agile can work – even more, that it can work in a state agency. And that state agencies can deliver true value when they really focus on it. We have also learned that the state can regulate how much it wants done at once. This can – and should – be done in IT, but also in infrastructure projects like motorways (another anti-example from Slovakia). It gives you better quality for a lower price, and surprisingly it still gets done sooner in the end!

In any case, there is a long way for Slovakia and Slovaks to get to the point where we can focus on value and don’t have to fight an elemental lack of political culture (in which I include wasting, misusing and defrauding public money as well).

Neither Risto nor Ciara brought any political topics in, but some of the Slovak political “folklore” obviously affected the questions that were asked afterwards. Corruption was mentioned more than once. But these were areas where our speakers couldn’t help us (oh, how I envy them).

The presented topics were so interesting for us that the UX parts were often left aside a bit – although focusing on value and the user from the start is a pretty useful recommendation. But as with anything simple, it is much harder to do than something complicated and big.