Category Archives: Hardware

Am I a grumpy old git, or are we trying to kid the next generation that coding is really easy?


I’ve been a programmer for over 30 years. That’s not a boastful statement (at least it’s not meant to be); I’m simply saying that I’ve clearly been around the block a few times. This article is about coding, and specifically why we’re pretending to young people that it’s an easy thing to do.

So the purpose of this post is to complain, in a somewhat grumpy-old-git kind of fashion, about initiatives like “The Year of Code” and, perhaps to a lesser extent, the “BBC Micro Bit”. But more about why I want to whinge about those in a moment. First of all, a history lesson.

A life of code
As a developer, it’s important to speak lots of languages. Not the ones where you battle with a phrase book to order a cup of coffee in a foreign land and end up asking to sleep with the barista’s sister. No, I’m talking about coding languages.

I started off, as many do, with BASIC: Beginner’s All-purpose Symbolic Instruction Code. It’s deliberately simple to use and understand (it’s not called BASIC for nothing) and has always been a good way for budding coders to discover what communicating with a computer is all about.

My BASIC experience started on a Sinclair ZX81. It had 1KB of RAM, although I plugged in a rather wobbly 16KB RAM pack to boost this by quite a margin. In those days, all you had to look at was a flashing cursor on a black and white screen, and there was no audio. You learned very quickly that coding can, at times, be quite difficult; that you need to be patient and methodical; and that instant gratification is unlikely until you’ve committed several hours of work and solid debugging. It’s a salutary lesson in the kind of personal attributes you need in order to be a successful programmer. If you’re the kind of person who gets frustrated easily, then coding won’t suit you. Surely, as we’ll discuss in a moment, it’s a good idea to find that out sooner rather than later?

I found that I loved coding. It didn’t always go to plan. I couldn’t always get the computer to do what I wanted it to. But that didn’t matter to me. It just made me want to discover more so that I could make the device dance to my tune.

From BASIC I went through a number of languages on different systems. Some I used for a while, some I still use parts of, and yet more are long forgotten. I wrote an integrated Office suite from scratch in Pascal for my A Level Computing Science. On my Commodore Amiga I dabbled in an interesting coding language called FORTH (I wonder what happened to that). And, of course, I’ve used C, C++ and, in more recent years, Objective-C, Swift, PHP and more database and web-based technologies than seems sensible to list here without boring you rigid.

Each language has its own peculiarities, advantages and disadvantages. Sometimes you have no choice about which language to use. For example, for many years Objective-C was (almost) the only answer for iOS and Mac OS X development. That changed in 2014, and I’ve recently started using Apple’s new Swift language and am really enjoying it. One of the attributes I mentioned earlier is being open to change like this: the shifting sands of technology. Sometimes those sands will shift so quickly that you’ll have to change the way that you work completely. That’s just the way it is in coding, and it keeps us geeks on our toes!

After those 30 years (and then some) I’m still discovering, learning, failing (often) and debugging constantly. And I love it. Because I’m the right kind of person to be a programmer.


So time for my moan
There’s an initiative that’s been running for a while now called “Year of Code”. I don’t claim to know much about it, but what I do know is that it encourages young people to get into computer programming. Or at least that’s its intention. There are other initiatives, such as the recently launched BBC Micro Bit, that aim to do similar things.

Let’s get something straight from the start – I have no problem at all with trying to get people to code. In fact, I think it’s essential that we do; my complaint is how we seem to be going about it.

In 2014, the Year of Code campaign, with backing from the British government, appointed a figurehead: Lottie Dexter. She’s one of those bubbly “can do” sorts that, I guess, the people in charge thought would appeal to the media. I have nothing against Ms Dexter, and I don’t blame her entirely for the situation she found herself in; however, it was still mind-bogglingly vomit-inducing. I saw her interviewed on BBC Newsnight one evening, where it turned out that she didn’t know anything about coding. She didn’t even know what it was. She was planning to “learn it over the next year” and claimed that teachers could “be trained how to educate students in computer programming in a day”.

Clearly, no-one had bothered to check whether Ms Dexter had the necessary skill set to be a coder, because (obviously) anyone can do it. Anyone can be an Ada Lovelace or an Alan Turing.

Thanks very much for belittling my profession and hard earned years of experience. No, really, thanks.

This, in a nutshell, is what annoys me about initiatives like “Year of Code”: the total lack of understanding, and the failure to accept that programming takes a certain type of person and many years to perfect (if, indeed, one ever does). Like every single profession, programming is the right job for some people and the wrong job for everyone else. These schemes all seem to assume that absolutely everyone can code.

I’ve got news for you: They can’t.

And there’s more…
Around the same time as Lottie-gate, I remember watching an interview with a child, no more than about 10 years old. The interviewer was saying enthusiastically that the little darling was a programmer and had written their own game. “Isn’t it amazing!” gushed the TV airhead. Well, no, it wasn’t amazing really. The “programming” in question was an environment in which some elements were dragged onto a virtual canvas, and then values were keyed into a handful of on-screen boxes to decide what the “program” would do. No coding was harmed (or written, for that matter) in the making of that “game”. What that had proved was that the child could type a number and use a mouse. That hardly constitutes coding in my book.

Once again, this is an example of sending a signal to children that coding involves a simple bit of drag and drop or, in the case of the BBC Micro Bit, a few (very) simple commands in order to make some lights flash. The BBC Bitesize page on coding, incidentally, doesn’t even spell “program” correctly, but I digress (not for the first time in this article).

So what should they be doing?
What really needs to be done is to demonstrate to children what programming really is, warts and all. I’m not suggesting that an 8-year-old should be subjected to six weeks of intensive C++ object-orientation techniques; however, teaching the fundamentals of coding in schools, in order to find those who have an aptitude for it, would be no bad thing. In all subjects you start with the basics.

Take physics, for example. Those in year 6 won’t be discussing the plausibility of multiple universes or the finer points of string theory; they’ll be playing with weights and discovering the fundamental principles of gravity. Similarly with coding: a language like LOGO would be a good introduction, alongside the basics of binary and solving logic problems. It’s not rocket science. Those who like it will be open to the more complex sides of coding, while being under no illusion about what it’s really going to be like as a profession.
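
To give a flavour of the sort of bite-sized exercise I have in mind, here’s a sketch in Swift (my current language of choice). It works out the value of a binary string by hand – the exercise and the function name are purely my own illustration, not part of any curriculum:

```swift
// A beginner-sized exercise: work out the value of a binary string by hand.
// The point isn't the answer (Swift can do this in one line); it's the
// thinking - loops, conditions, and hunting down the bug when it goes wrong.
func binaryValue(of digits: String) -> Int? {
    var total = 0
    for character in digits {
        switch character {
        case "0": total = total * 2        // shift left, add nothing
        case "1": total = total * 2 + 1    // shift left, add one
        default:  return nil               // not a binary digit at all
        }
    }
    return total
}

if let value = binaryValue(of: "1011") {
    print(value) // prints 11
}
```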

In summary
You may think that I don’t want young people to learn coding because they’ll take my job away from me. That couldn’t be further from the truth. As we become ever more reliant on technology, there will be plenty of coding opportunities for all of us. However, my point is that coding isn’t something that just anyone can do, and no amount of poorly-thought-out initiatives is going to change that.

Coding is time-consuming, it often involves working to tight deadlines, and it’s not unusual for projects to change completely during their development. Don’t be surprised if you’ve spent hours working on something only to find it canned and replaced by a new requirement at the eleventh hour. It happens. Spending many hours looking for one tiny bug that might come down to a single comma or a full stop isn’t unusual either; you need the patience for days that go like that. However, seeing a web page, a computer or a tablet do something just because I’ve programmed it to work in that specific way never gets boring, and there are plenty of other devices out there that need coding in order to come to life.

Just because I code for a living, it doesn’t mean that I no longer code as a hobby. I was looking for a gadget recently and couldn’t find the exact spec I wanted, so while the majority of people would prefer to purchase a finished product, I’ve decided to build and program one myself, most likely based on a Raspberry Pi. I’ll do it because I can. I’ll do it because I have the right kind of logical brain that will happily work through all the inevitable problems to get to a working solution. That’s the joy of coding for me. It won’t be easy, but it will be very rewarding.

And that’s the idea that we need to sell to those children who have the aptitude to do likewise.


Introduction to the Apple Watch

The Apple Watch launched in April 2015, so this blog post and video, recorded in January 2016, are, in product terms, quite late! However, like many people, I purchased an Apple Watch (in my case the Sport edition) for Christmas 2015, and these are my thoughts.

The video features a complete look at the Apple Watch, with an unboxing, pairing with an iPhone, and a tour of some of the pre-installed (and a couple of third-party) apps. I also talk about the good and bad points of the device, as well as giving a recommendation on whether, in early 2016, you should consider purchasing one yourself.

I can’t promise that the demonstration section of this video is exhaustive (it most certainly isn’t); however, I hope that during the half hour this film runs (by far the longest video on my channel so far) you’ll pick up some hints and tips that will be helpful if you’re considering buying a smart watch yourself.

As always, I’d love to hear your comments about the Apple Watch, competing products (like Pebble and Android Wear) and also what you think of my latest attempt at a video!

An introduction to the Apple TV 4

The Apple TV 4 has recently launched and I’ve got my hands on one to review.

In this video I compare the specs of the new hardware with the Apple TV 3 and take a look at the new unit’s operating system, tvOS. I show installing an app, searching with the on-screen keyboard and the built-in voice commands, and briefly show the Netflix and Plex apps in use.

Finally I sum up what I think of the Apple TV 4 and whether you should think about purchasing one.

I hope you enjoy the content. Please give it a thumbs up and, if you enjoyed the video, I’d be delighted if you subscribed to my channel.

Thanks for watching!

My first VLOG

Something I’ve wanted to do for a while is to VLOG as well as BLOG and this is my first attempt at the former.

I originally wanted to start my VLOG some time ago; however, my iPhone 5s had a fault that meant audio wouldn’t record when using the front-facing “FaceTime” camera – the ideal one for VLOGing purposes.

My new iPhone 6s Plus, however, is fully working (I should hope so, I’ve only owned it for 3 weeks!) so I can now give VLOGing a go!

The above video was recorded during the first couple of weeks that I had my new iPhone and covers various subjects – a new Bluetooth remote camera activation tool, a selfie stick, backing up my NAS to an external USB 3 hard drive and, most important of all these days, a look inside my fridge! You know it makes sense.

I hope you enjoy the video. Please give it a thumbs up on “the YouTubes” and don’t forget to subscribe if you want to see more content like it. Although goodness knows why you’d want to.

3D Touch on the iPhone 6s

One of the new features of the iPhone 6s is 3D Touch, a new technology that enables the device to detect not only traditional swipes and taps, but also presses into the screen.

The iPhone 6s can not only detect a forceful push on the display – it can also determine how hard you’re pushing it. That’s why Apple have called it 3D Touch – it takes interaction with our mobile devices quite literally into a new dimension.

The best way to demonstrate this new innovation is to show it in action, so please watch my latest video (above)!

When I first heard about 3D Touch I thought it might be a bit of a gimmick, however I’m already finding that I use it every day – it really is rather useful.

This is only the beginning: as developers come up with ingenious ways to act on presses “into” the screen, we’ll no doubt begin to see all sorts of interesting new uses for this technology.
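
For the developer-curious, here’s a minimal sketch of what reading that pressure looks like in Swift, assuming modern UIKit syntax and a device that actually has the 3D Touch hardware; the PressureView class and its fade-with-force behaviour are just my own toy example, not anything Apple ships:

```swift
import UIKit

// A toy view that reacts to how hard the user is pressing.
// On 3D Touch hardware every UITouch carries a `force` value,
// which can be normalised against `maximumPossibleForce`.
class PressureView: UIView {
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard traitCollection.forceTouchCapability == .available,
              let touch = touches.first else { return }
        // 0.0 is a resting touch; 1.0 is as hard as the hardware registers.
        let pressure = touch.force / touch.maximumPossibleForce
        // Fade the view with pressure - a gimmick, but it shows the idea.
        alpha = 1.0 - (pressure * 0.75)
    }
}
```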

If you liked it, please give the video a thumbs up on YouTube and if you enjoyed the content, please subscribe to my channel where you can find lots more!

Comparing the iPhone 4, iPhone 5s and iPhone 6s Plus

Here’s a new video on the YouTube channel I’ve started up to support my blog. There’s not much to write about, as it’s all in the visuals!

The video compares and contrasts three iPhones: the iPhone 4 (released in June 2010), the iPhone 5s (released in September 2013) and, finally, the newest of the iPhones, the 6s Plus, released last month.

I think it’s interesting to compare the products’ specs. There’s not much hands-on apart from a quick comparison of how Touch ID has been improved on the 6s.

I talk briefly about 3D Touch, but I think I incorrectly call it Force Touch, so I apologise for that!

I hope you enjoy the video. Please give it a thumbs up on YouTube and if you enjoyed the content please subscribe to my channel to be notified of new videos.

Adding a monitor arm to my desk setup

This week I added a new monitor arm to my desk setup. In this blog (and accompanying video) I describe why I chose the particular arm I went for and how I’ve found using it since installing it.

I’ve had a computer for years, so I’ve had different monitors for years too. I’ve never had an all-in-one (like an iMac), instead preferring to add a screen of my choice to a headless computer. The advantage of this approach is that you can change your display without having to replace the entire machine.

When I bought my cylindrical “trash can” Mac Pro in December 2013 (finally received in February 2014, as regular readers will know all about!) I knew that during the lifetime of the machine I would want to add a retina display. The prices and specs at that time were not good, so I left it until July 2015 before purchasing an LG 31MU97, a true 4K (and therefore “retina”) display. It’s fantastic!

One of the criteria for my new screen was that it should include a VESA mount. As I said, I’ve owned different monitors over the years but none of them have ever had a VESA mount and I liked the idea of having that flexibility in my locker should I need it.

The stand that comes with the LG is not bad, as stands go, but for me (as I demonstrate in my video) the height adjustment isn’t quite enough, especially as, in an ideal world, I would have the screen at slightly different heights depending on whether my desk is in a “sit” or “stand” position (I have a video coming up soon showing the sit/stand desk if you’re interested in one of those). Replacing the supplied stand with a new VESA-based alternative seemed the appropriate solution.

What to get?!
Then the problems (and the research) start. Should I get an arm or a new stand? Should I get a mount capable of adding additional arms should I decide to get another screen? What manufacturer should I go for? Duronic? Ergotron, perhaps?

I spent quite a bit of time looking at articles on the intertubes and videos on “the YouTubes”, as well as at manufacturers’ own websites and information.

It makes your head spin!

I eventually stumbled upon a company called Chief, based in the United States. I’d never heard of them; however, their arms seem well constructed and, crucially, have a rather nifty cable management solution built in – most of the competition use standard clips similar to those found on normal stands.

I selected an arm called a “Kontour” (yes, with a “K”) that I thought would work well for me. My computer and monitor are black, so I rather fancied having the black version of the arm; however, it’s not for sale in the UK unless you want to pay a hefty surcharge to have it imported from the US. So I went with the silver version instead, at half the price.

Delivery and Installation
I bought it through a reseller that will remain nameless on this blog; however, I won’t be using them again – the packaging was simply awful and I was very worried that the goods inside would be damaged. However, Chief clearly build solid products, so while the box had barely survived the journey, the arm seemed OK. A small scratch, but nothing functionally out of place.

In my video you can see part of the installation. Unfortunately, both of my camera batteries were nearly out of juice, so I couldn’t record the entire process; however, hopefully you can see what I started with and how it ended up.

Final thoughts
The arm itself is of solid construction. It’s well built and the cable management is very nice. As you will see from my video, the spaghetti of cables you could see beforehand has gone and I now have a nice clear desk. I can even position my laptop below my LG should I need to. There was initially a small amount of wobble from the screen; however, this was more to do with my desk, and some tightening of bolts (on both the arm and the desk) virtually removed the annoyance. It’s certainly not a big problem or one I’m going to remain concerned about.

For the record, the arm I went for was the K1D120S. I would’ve preferred the K1D120B, which is the black version, but not at twice the price. I can only comment on what I’ve used and, having never come into contact with an arm by anyone else, I can’t say whether this one is better or worse than the alternatives; however, I’m pleased with what I’ve bought and it makes a nice addition to my 2015 desk setup.

LG 31MU97 4K monitor

Ever since I bought my new cylindrical Mac Pro just over a year ago, I’ve wanted a retina display to plug in to it. That time has finally come as I’ve purchased an LG 31MU97 31″ true 4K monitor.

A “retina” display is one that has so many pixels that you can’t see them. Most desktop displays are around 100ppi (pixels per inch), and at that density the pixels can be seen quite easily with the naked eye. Smartphones and tablets have been using much higher pixel densities for some time now: the retina iPad is 264ppi and some Android phones boast over 300ppi.

Instead of mapping their on-screen elements one-to-one across the same number of pixels, these high-resolution screens use “virtual” resolutions, with each element drawn using vastly more pixels, thereby giving you enhanced definition – you can’t see the pixels (that’s why it’s called “retina” in Apple circles).
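
If you want to put numbers on all this, pixel density is just arithmetic: the diagonal pixel count divided by the diagonal size in inches. Here’s a back-of-the-envelope sketch in Swift (the function is my own, purely for illustration):

```swift
import Foundation

// Pixel density = diagonal resolution in pixels / diagonal size in inches.
func pixelsPerInch(width: Double, height: Double, diagonalInches: Double) -> Double {
    return (width * width + height * height).squareRoot() / diagonalInches
}

// My old 30" Apple Cinema HD Display (2560 x 1600): roughly 101ppi.
print(pixelsPerInch(width: 2560, height: 1600, diagonalInches: 30))
// The LG 31MU97 (4096 x 2160 at 31"): roughly 149ppi.
print(pixelsPerInch(width: 4096, height: 2160, diagonalInches: 31))
```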

The reason I’ve wanted a desktop retina display for a while now is that all of my other computing devices have one – my iPhone 5s, iPad 3 and MacBook Pro all use high-resolution, pixel-packed screens, and they look gorgeous as a result. Ironically, my most expensive computer – the one I use the most, every day of the week – has had the worst-quality display connected to it.

Don’t get me wrong, the display I’ve been using for the last four and a half years (a 30″ Apple Cinema HD Display) is the best screen I’ve ever owned. Until now. I’ll be sorry to see it go.

Prices
When the Mac Pro launched in late 2013, Apple demonstrated it attached to Sharp 4K displays. These cost thousands and were out of the reach of most users, especially those of us who had just dropped a load of cash on a new cylindrical Mac masterpiece. Since then I’ve been watching the market closely.

First of all, TN-panel-based 4K monitors dropped to a really good price, mainly prompted by gamers, for whom these fast displays are ideal. Professional users have been waiting for higher-quality IPS panels to do the same thing. It’s taken a while; however, there are now plenty of UHD screens out there at an affordable price.

What to buy?
For a long time I wondered which display to purchase. I toyed with the idea of getting two Dell P2715Qs. These are UHD (3840 x 2160) and about two thirds of the price of the screen I eventually purchased. I also considered screens from BenQ, Asus and Acer. All of these were UHD.

I finally went for the LG 31MU97. This screen is true 4K (4096 x 2160) and costs a little more than the UHD alternatives, but with added features. As well as the wider aspect ratio (not entirely necessary for me, but still welcome), it has both sRGB and Adobe RGB modes. This is ideal for me, with both my web and print work.

I followed a thread over on Mac Rumors for a while, with some people reporting problems with the LG; however, these were all but solved with the release of OS X Yosemite 10.10.3. I took a bit of a punt buying the display, but I’ve now been using it for 5 days and am very pleased with it!


The colours pop out of the screen, text is sharp and highly readable and you can’t see any pixels. Despite running the (entry-level) D300 cards, the Elite Dangerous Mac beta plays fine at 4K, albeit with a reduced frame rate. I’m still playing around with the settings, sometimes running at a lower resolution with higher detail and other times lower detail with high res. Not sure which way I’ll finally jump.

For desktop work, I’m using a scaled resolution of 2560 x 1350, which is close to the display resolution of my old 30″ ACD but a little larger, which is better for my eyesight!

Very pleased with the purchase. If you have any questions, please post them below and I’ll answer them.

The Mac Pro – Two months on

It’s been a while since my last post, so it’s about time I wrote some more drivel.

In my last post, my Mac Pro had been with me for 2 weeks. Now, after 2 months, I’ve had a good chance to put it through its paces and, boy, it hasn’t disappointed.

The machine itself is understated, sitting silently on my desk, calmly and effortlessly coping with whatever I throw at it. My old Mac Pro struggled to run Windows 8 under Parallels; this new machine performs that job admirably, being perfectly capable of running not only Windows 8.1 but also Windows XP under Parallels, plus several programs under OS X Mavericks, all at the same time. It hardly gets warm while doing it.

Internet Babble That Makes No Sense
That, really, is my point about the Mac Pro. I’ve read several blogs and watched a few YouTube videos where so-called experts have said that no-one needs the power of a Mac Pro unless they’re editing 4K video. Some also point out that the single-threaded Geekbench score of the latest iMac is slightly higher than the equivalent single-threaded score of the entry-level Mac Pro.

My response to all this is that no-one uses their Mac Pro to do just one thing. We all perform multiple tasks on our machines, whether or not we consciously set out to work like that. Unlike the Haswell Core i5 chips in most iMac configurations, the Ivy Bridge Xeon processor in the Mac Pro runs 2 threads per core, so those commentators who compare the quad-core iMac to the quad-core Mac Pro and say it’s the same have no idea what they’re talking about. The new Mac Pro not only has a minimum of 8 threads available, it also has faster memory, faster PCIe-based flash storage and faster “everything that connects those things together”. There’s simply no comparison. So stop it.
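
If you want to see the difference for yourself, here’s a small sketch in Swift that asks macOS for the physical and logical core counts, via the standard hw.physicalcpu sysctl; treat it as illustrative rather than gospel:

```swift
import Foundation

// Compare physical cores with the logical (hyper-threaded) count
// that the operating system actually schedules work onto.
var physicalCores: Int32 = 0
var size = MemoryLayout<Int32>.size
_ = sysctlbyname("hw.physicalcpu", &physicalCores, &size, nil, 0)

let logicalCores = ProcessInfo.processInfo.activeProcessorCount
print("Physical cores: \(physicalCores), logical cores: \(logicalCores)")
// On the entry-level Mac Pro this would report 4 physical and 8 logical cores.
```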

Astonishing speed
I bought the 6-core model, so the 12 threads on my machine are ample for all my work and all the tasks that I need to perform simultaneously. The boot-up time of the machine and the software is amazing. My old Mac Pro took several minutes to start up – and that was before I opened any software. I always used to put it into sleep mode instead for that reason. My new machine starts up in seconds – and that includes loading several programs. Photoshop appears in 2 seconds, when it took about 3 minutes on my old machine.

In summary
In summary, then, you can rest assured that I’m not sorry about the money I’ve spent on my new machine. I spend a good deal of my time sitting at it working so, for me, it makes sense to splash the cash on my main workhorse. To those naysayers who think it’s a waste of money I say this: “Go back to your crappy Windows machines and see if you can find your way around the badly thought-out interface while you scan your system for viruses, and leave me to enjoy my new toy.”

And yes, that was over the top. I should probably edit that last bit out…

 

The new Mac Pro is here

In my last post I wrote that my new Mac Pro had finally left Austin, Texas, some 41 days after I ordered it, and was due to arrive on Tuesday 4 February. It’s 2 weeks later now (18 February 2014), so time to report on progress.

The computer was actually delivered a day earlier than anticipated, on Monday 3 February. It arrived just before 9.30am, but I was so busy that day that I didn’t even open the box until after 5pm. Quite restrained of me, I thought.

After quickly starting the new machine up to check all was OK, I shut it down again and replaced the 12GB of stock RAM with 32GB I’d purchased from Crucial. I’ve read conflicting reports about the best way to upgrade the RAM. The new Mac Pro handles 4-channel memory really well, so the argument goes that filling all four slots is a good thing to do. I went for 2 RAM sticks instead, so that I can add more RAM later on if I need to without having to throw away the sticks I’ve just bought. I’m working on the basis that you can never have enough RAM, and what seems like loads now probably won’t seem so much in a couple of years.

I’m reminded of the famous Bill Gates quote from 1981, “640K ought to be enough for anybody”, although he later claimed he never said it.

Setting things up
I used my old Mac Pro for very nearly 7 years as my main computer. That’s quite a lifespan for a piece of tech that’s used for several hours a day, most days of the week. I’ve never been one for installing lots of rubbish on my system – I just don’t see the point – however, over time I’ve inevitably collected a certain amount of faff that I no longer need. It was for this reason that I chose to avoid any kind of automated transfer of my files and to manually install/copy what I needed.

Much of this process was, in actual fact, complete already, as my data is pretty much all held on my NAS, so as far as the computer was concerned it was a case of installing applications only. For this purpose I kept the new machine side-by-side with the old one until Sunday 16 February, when it was finally time to shut down the old cheese grater for the last time. For that 2-week period I set up my old Dell monitor and a keyboard on the old machine, but now that’s gone, the Dell is being used as a second monitor for my new machine (as you can see in the photo above).

Mavericks
The biggest change – apart from the massive speed increase of the new Mac Pro – is having OS X Mavericks as my main operating system. My old machine was stuck on Lion, so the move to Mavericks (on my new Mac Pro and my MacBook Pro) brings me right up to date. Mavericks has enabled me to use my second monitor properly, as multiple-monitor support on pre-Mavericks versions of OS X was flawed.

Summary
I just wanted to post that the new machine is here. I’ll post more details in the future about the Mac Pro itself and how it’s shaping up. Watch this space!