I reckon the record for the longest downtime for a computer must belong to the Difference Engine. From about the middle of the 19th century until the Science Museum rebuilt it, that’s about 150 years of downtime. My own record is a bit more modest, but more real, since it is physically the same machine.
This is my Commodore Amiga, last powered on in the 1990s. Complete with 500MB GVP hard drive and Naksha mouse. As you can see it still works (although it took a few goes to get it to boot) and is rather noisy. I can’t remember how to use it, although it has a version of emacs, tex, some letters, a dial-up modem connection to Demon Internet and a few games I can’t remember how to play. Its main use was Sensible Soccer so I will need to find the disks for that!
Incidentally, to close off the last post, my VPN tunnel worked!
Ever since I had my first QNAP (a TS-219, which I think came out in 2010) I’ve liked QNAPs. Apart from the odd booting problem and the frequent updates it’s been perfect: no hardware problems and ever-increasing functions and apps.
One of those apps is a VPN server and it’s always been too fiddly to get working – until now. Currently I have an SS-439 and the software now seems mature enough to work with the minimum of fuss.
I enabled the OpenVPN application, downloaded the certificate, set up the user etc. and downloaded the OpenVPN client for Windows 7 (yeah need a new laptop veeeery soon). Having done the client side config, I tried this on the office Wifi. No luck. Wouldn’t connect at all.
First debugging step was to try it on the inside of my network (with a brief interlude to upgrade Wireshark). Same result. That took me to looking at my BT Smarthub 2 firewall. Looks like no blockers there but I decided to add a port forwarding rule just in case.
This did *something*, because the behaviour of the client changed. Having turned on logging for VPN on the QNAP, I could see failed user logins which corresponded to the message on the client.
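A quick way to sanity-check whether a port forward is actually reaching the NAS is a plain socket connect. A minimal sketch in Python is below; the hostname and port are placeholders (OpenVPN defaults to UDP 1194, and a TCP connect like this only tells you anything if the server is configured for TCP):

```python
# Check whether a TCP connection to host:port succeeds - a crude way
# to see if a firewall/port-forwarding rule is letting traffic through.
# NB: OpenVPN's default transport is UDP 1194; this test only applies
# if the server is listening on TCP.
import socket

def port_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Hypothetical hostname - substitute your own DDNS name or IP.
    print(port_reachable("my-nas.example.com", 1194))
```

Running it from inside and then outside the network tells you whether the problem is the forwarding rule or something further along.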
I verified the username and password and then noticed that the username was case sensitive – an oldie but a goodie.
Having fixed that I connected fine – from the inside. Tomorrow the real test will be to see if I can get to it from the outside! I am hopeful! If it works, I might have a working VPN option I can use if I’m ever in China again. I spent ages trying to get a VPN working the last time I was there without success.
No, no, I’m not hitting the off switch, although you could be forgiven for thinking so given how infrequently I blog. This is in response to a recent lecture by Stuart Russell I attended at the 2019 DX Expo.
In this interesting talk, one of the topics was the off-switch problem, described on Wikipedia and no doubt in his latest book. This problem can be summarised as follows:
“A robot with a fixed objective has an incentive to disable its own off-switch.”
This is about who/what has control. Are humans able to turn it off if the objective does not align with ours?
The theory goes that you give the robot a positive incentive to turn itself off in situations where it determines the outcome of its actions is uncertain.
I have two problems with this theory.
The first is that a physical robot acting in the real world is equated with the AI. This can be misleading: robots and AI are two different concepts. It’s true of course that some or most robots will run AI s/w, but it’s not true that all AI needs direct control of a physical actor to achieve its goals. That can be done by manipulation of data and “human engineering”. The physical presence is almost irrelevant. The problem we are trying to solve is one of control, and there’s no off-switch for the internet.
That leads me to the second objection. Implementing an algorithm which means the robot turns itself off just moves the problem from the physical switch to the controlling algorithm. It is assumed we control the algorithm and the code. The real danger in losing control of AI is when AI s/w becomes intelligent enough to write itself. All the theory does is move the problem, arguably to a more difficult space to solve. At best, probabilistic programming is a short-term solution which only lasts as long as we control the code.
I joined the internet of things! Well, not me personally but I bought a data logger which uploads its data to easylogcloud.
It’s a humidity / temperature logger which I have installed next to the piano. Pianos don’t like humidity, or to be more precise, changes in humidity, so I am keeping an eye on it to determine if I need a Dampp-Chaser.
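Since the logger uploads readings anyway, flagging the ones that drift out of range is a one-liner. A toy sketch, assuming a 40-60% relative humidity comfort band (a figure commonly quoted for pianos, not one from the logger’s documentation):

```python
# Flag logger readings outside a target relative-humidity band.
# The 40-60% RH band is an assumption for illustration, not a spec.
def humidity_alerts(readings, low=40.0, high=60.0):
    """Return the (timestamp, rh) pairs falling outside [low, high]."""
    return [(ts, rh) for ts, rh in readings if rh < low or rh > high]

if __name__ == "__main__":
    sample = [("09:00", 45.0), ("10:00", 62.5), ("11:00", 38.0)]
    print(humidity_alerts(sample))  # the 10:00 and 11:00 readings
```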
Actually, the day after I wrote the last post, I managed to fix my old bricked phone. It certainly helps if you read the instructions properly. That’s what we call a picnic error (problem in chair, not in computer).
So, I can use the tool now but I haven’t yet summoned enough courage to try newer firmware on my newer phone.
Also, I took the GCP Associate Cloud Engineer exam on Friday and passed (at least provisionally). They send you a confirmation in about a week and I guess it would be unusual and somewhat unfair if they changed a provisional pass into a fail!
The exam itself is 50 multiple choice questions, and if you don’t know the answer 100% then it’s relatively easy to eliminate the obviously wrong answers and make a good stab at the remaining two by carefully reading the question. In fact, careful reading of the question is the most important thing you can do – as important as revising!
My main learning resource was Linux Academy. The lectures and labs were good and their practice exam excellent preparation.
…So it’s a week later and I forgot to post this. On the plus side, Google has confirmed my exam result!
Whilst I wait for another Android firmware version to download, I can pass the time by writing this.
I dipped my toe into the mysterious world of firmware upgrades on Samsung phones and have managed to brick (technical term) my test phone. This all came about because I had been using a very old Galaxy SM-J320FN hand-me-down as a back-up to my iPhone. It was useful as an alternative to iOS but 8GB was a bit limiting, so I decided to upgrade (cheaply and after a lot of research) by buying a second-hand J7 on eBay for about £90. Seems like a good deal when a new S10 costs about £900.
It’s a nice phone, the SM-G610F, with dual SIM, 32GB and a micro-SD slot. However, it came with United Arab Emirates firmware, Android 6.0.1, and an older kernel and security patch level than the old J3.
So, how hard can it be, I thought, to upgrade the software? Quite hard as it turns out. I’ve discovered that Samsung phones tend to suffer from “snowflake syndrome” – no two are the same. Not only do Samsung make and sell phones for certain regions, or in some cases countries, they also make phones for specific providers. Firmware is very specific to the model and country, and whilst my software information with the UAE firmware says it is an SM-G610F, the back of the phone describes itself as an SM-G610Y/DS – impossible to find new firmware for.
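Given adb access, one way to see what the phone itself reports is to dump its system properties with `adb shell getprop` and pick out the interesting ones. A small parser sketch is below; the property values in the sample are made up, and `ro.csc.sales_code` as the region/carrier code is my assumption about Samsung’s naming, so treat both as illustrative:

```python
# Parse 'adb shell getprop' output, which prints lines of the form:
#   [ro.product.model]: [SM-G610F]
# The sample text below is invented for illustration.
import re

def parse_props(getprop_output: str) -> dict:
    """Return a dict of property name -> value from getprop-style output."""
    props = {}
    for m in re.finditer(r"\[([^\]]+)\]:\s*\[([^\]]*)\]", getprop_output):
        props[m.group(1)] = m.group(2)
    return props

SAMPLE = """\
[ro.product.model]: [SM-G610F]
[ro.csc.sales_code]: [XSG]
[ro.build.version.release]: [6.0.1]
"""

if __name__ == "__main__":
    props = parse_props(SAMPLE)
    print(props["ro.product.model"], props["ro.csc.sales_code"])
```

Comparing what the software reports against what’s printed on the back of the case is exactly where the snowflake syndrome shows up.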
Now, I’m not daft enough to risk bricking my new phone but I am daft enough to try upgrading the J3, just for practice you understand. This took me down a route of installing Smart Switch, Odin and Kies. There is firmware for the J3 on Sammobile, so I started by downloading that and using Odin3 to flash the phone. It got to the last step before reporting “Fail”, a situation which has been repeated with the three older versions of firmware I have tried.
The phone itself tells me “Firmware upgrade encountered an issue. Please select recovery mode in Kies and try again.”. Unfortunately, Kies 3 does not recognise the phone. In emergency recovery, it does not appear in the list and attempting to use the initialisation function results in “SM-J320FN does not support initialising”.
I see my 2017 firmware has finished downloading … let me try that…
It’s only fair to say that the upgrade to the latest BT Infinity 2 packages was pretty smooth. The service has been great since November 2017 when I switched from Virgin. Tempted by the recent advertising, I had a look to see what my options were. I was particularly interested in the wifi disc boosters as parts of the house don’t have a good signal (according to the people that lie in their beds there).
I was already on an unlimited fibre package and all the options seemed to give the same raw upload and download speeds, but I thought the prospect of the latest hub and a wifi disc was worth paying a few pounds a month for. So, having placed the order, the hub and disc (only one) arrived the next day.
Setting up the new hub was easier than I anticipated: I just plugged it in, turned it on and it worked. The only changes I made were to the wifi name and password and the admin password. Keeping the wifi name and password the same means all the existing devices are unaware of the change. You can even send your old hub back, pre-paid (I factory reset mine first).
The disc was a bit trickier… it took several goes to pair it to the hub. Maybe I had a cable issue, as it did not seem to work on one of my cables but eventually did using the cable that came with the hub. It took longer than I thought it would, with much staring at the various flashing colours to work out what they mean. At one point I suspected I should have paired it before changing the admin password, but you can’t change it back, at least not without a factory reset – the s/w complains if you try. That wasn’t the problem though, as it worked eventually.
In summary, it seems to have fixed the weak wifi signals, and as a bonus you can even use the ethernet port on the disc to connect truculent machines like this CentOS one, which I never could get the wifi dongle to work on.
Imagine my delight to discover Capybara Games has produced “Below”, a 21st-century version of the text-based dungeon games like Rogue, Larn, Hack and NetHack I played at uni in the 80s. I still play from time to time using an Ubuntu VM on my laptop; my current game’s level 7 looks like this:
I liked this game so much back in the day that I wrote a dungeon generator in 6502 machine code for the BBC Micro. I got stuck at that point as I had used up all the memory. In fact, I think I have some original Rogue or Larn source code on a reel of tape in the attic. Legend has it Ken Arnold wrote Rogue to help debug his Unix curses package. The original paper is still available here.
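The basic rooms-and-corridors idea behind those generators is pleasingly small. A toy sketch in Python (my own reconstruction of the general technique, nothing to do with the original 6502 version): carve a few random rooms into a wall grid, then join consecutive room centres with L-shaped corridors.

```python
# A tiny Rogue-style dungeon generator: random rooms on an ASCII grid,
# joined by L-shaped corridors. Rather easier than doing it in 6502
# machine code with the BBC Micro's memory budget.
import random

WALL, FLOOR = "#", "."

def generate(width=40, height=15, rooms=4, seed=None):
    """Return the dungeon as a list of strings, one per grid row."""
    rng = random.Random(seed)
    grid = [[WALL] * width for _ in range(height)]
    centres = []
    for _ in range(rooms):
        w, h = rng.randint(4, 8), rng.randint(3, 5)
        x = rng.randint(1, width - w - 2)
        y = rng.randint(1, height - h - 2)
        for r in range(y, y + h):          # carve the room
            for c in range(x, x + w):
                grid[r][c] = FLOOR
        centres.append((y + h // 2, x + w // 2))
    # join consecutive room centres: horizontal leg, then vertical leg
    for (r1, c1), (r2, c2) in zip(centres, centres[1:]):
        for c in range(min(c1, c2), max(c1, c2) + 1):
            grid[r1][c] = FLOOR
        for r in range(min(r1, r2), max(r1, r2) + 1):
            grid[r][c2] = FLOOR
    return ["".join(row) for row in grid]

if __name__ == "__main__":
    print("\n".join(generate(seed=7)))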
My latest home tech purchase is a TP-Link 300Mbps Mini wireless N USB Adapter. This was intended to replace the cable I have to run every time I want to connect my server to the internet (my home wired network doesn’t work very well, and fixing it involves lifting floorboards).
Purchased from Maplin for £10, it looked ideal as it supported Linux. However… it supports Ubuntu, not CentOS. And even for Ubuntu you need to *compile* the driver into the kernel! But that’s OK, because that’s what we signed up for with Linux.
CentOS is a problem though. Unless I am missing something, there seems to be no native driver for the RTL8192CU chipset and none in the ELRepo repository I added for CentOS 7.
Compiling the driver for CentOS sounds like a bunch of work and I’m surprised no-one has done it already. I will have to do a bit more digging and maybe add it to the list of things to do.
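Before any compiling, it’s worth confirming which chipset the adapter actually presents, since the model on the box doesn’t always match. `lsusb` shows the USB vendor:product ID, and `0bda` is Realtek’s vendor ID; a sketch that scans its output is below. The product-ID-to-chipset mapping in it is my guess at a couple of common rtl8192cu-family IDs, not an authoritative table:

```python
# Spot likely rtl8192cu-family devices in `lsusb` output.
# 0bda is Realtek's USB vendor ID; the product IDs below are an
# assumed, non-authoritative sample of rtl8192cu-driver devices.
RTL8192CU_IDS = {"0bda:8178", "0bda:8176"}

def find_rtl8192cu(lsusb_output: str):
    """Return the lsusb lines whose ID matches the candidate set."""
    hits = []
    for line in lsusb_output.splitlines():
        # lsusb lines look like:
        #   Bus 001 Device 004: ID 0bda:8178 Realtek Semiconductor Corp.
        parts = line.split()
        if "ID" in parts and parts.index("ID") + 1 < len(parts):
            usb_id = parts[parts.index("ID") + 1]
            if usb_id.lower() in RTL8192CU_IDS:
                hits.append(line)
    return hits

if __name__ == "__main__":
    import subprocess
    print(find_rtl8192cu(subprocess.run(
        ["lsusb"], capture_output=True, text=True).stdout))
```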
I’ve bought some new home tech over the past few months so here is a summary of my experience.
First, a top-of-the-range Swann security system which includes 4 ultra HD cameras and a 2TB NVR, plus some very long ethernet cables. Being cabled, the cameras are of course a pain to install, and the size of the RJ45 plug means you normally end up drilling several large holes in your house. That can’t be avoided, but what could be avoided is the very bad software that comes with the system. Basically, I still haven’t been able to get it to work.
Second, I bought a set of disks for an old QNAP SS-439 Pro chassis and configured them as RAID 5 for 2TB of storage. This was simple to do, as was copying the old QNAP to the new one. All in all, QNAP has great s/w.
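As a reminder of the RAID 5 arithmetic: one disk’s worth of capacity goes to parity, so n disks of size s give (n - 1) × s usable. A minimal sketch (the 4 × 750GB example is made up for illustration, not the actual disks I fitted):

```python
# Usable capacity of a RAID 5 array: one disk's worth of space is
# consumed by parity, so n disks of size s yield (n - 1) * s.
def raid5_usable(disks: int, size_gb: float) -> float:
    """Return usable capacity in GB for a RAID 5 array."""
    if disks < 3:
        raise ValueError("RAID 5 needs at least 3 disks")
    return (disks - 1) * size_gb

if __name__ == "__main__":
    # Hypothetical example: four 750GB disks give 2250GB usable.
    print(raid5_usable(4, 750))
```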
So in summary, Qnap still on the “buy” list and Swann a “hold”, leaning to “sell”. Have I been working in corporate land too long?