OU blog

Personal Blogs

Picture of Clive Hilton

Google knows

Visible to anyone in the world


Of late I’ve become sufficiently concerned with Google’s underlying motives in choosing to ‘simplify’ its user privacy policy that I’ve taken the decision to actively retreat from the vast majority of its offerings. Not that I had a great many in the first place; at least, not by today’s norms. There’s analytics, calendar, a Google+ account (for all of about a week) and an Android phone that I use rarely, and that’s pretty much it. I’ve never been tempted by Google Docs and the same is true for Gmail.

Given this relative paucity of committed engagement with Google’s beguiling offerings, one could be forgiven for thinking that Google wouldn’t actually have much on me that would enable it to do what it would so very much like to do – which is to serve up a more ‘personalised’ online experience; which is, of course, code for presenting me with adverts ‘better’ targeted to what Google thinks are my precise needs.

I tend to use Firefox for most of my browsing, but not exclusively so, and I have all the privacy settings across all the browsers I use set to their most restrictive. I don’t browse while I’m logged into my analytics or calendar accounts and caches and cookies are cleared on closing, no history is retained, ad blockers are installed and activated, ‘do not track’ options checked where available, etc. I also have a few Firefox add-ons that further limit what does or does not make it onto my browser. Yet increasingly, I’m seeing things from Google that lead me to think that Google already knows – or is deducing – a great deal more about my surfing activities than I imagined it might be capable of under the settings I choose to operate under.

Here’s a casual example. This evening I was reading an article on the BBC website about the forthcoming film, The Hunger Games. Since the whole Hunger Games thing had rather eluded me until I read the article, I did a quick Google on it to get the backstory. When I’d finished the article I returned to the Google search page, which (I’m guessing) I must have reloaded, because now at the top of the listings was a link to a local cinema advertising the film. Now, I know that it would have been a matter of plain routine for Google to use my IP address to work out where I am and to extrapolate from there to find the nearest cinema screening the film. But it seems to me that Google is also accessing other little nuggets of information that, even within my less than co-operative browser settings, it has no difficulty in tracking and exploiting.

This could, of course, be complete coincidence but I’ve seen quite a bit of this sort of thing recently. Could it be that cookies or somesuch are actually being passed from site to site as I browse and used to build a picture of my browsing activities? Quite possibly.
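For the technically curious, the mechanism I’m speculating about can be sketched in a few lines of Python. This is a toy model, not how any real ad network is implemented; the tracker domain and page names are entirely made up. The essence is that many unrelated sites embed a resource (an advert, an analytics script) served from the same third-party domain, and the browser dutifully sends that domain its cookie – plus a Referer header naming the page being visited – on every one of those requests:

```python
import itertools

class Tracker:
    """Hypothetical third-party ad/analytics server."""
    def __init__(self):
        self._ids = itertools.count(1)
        self.profiles = {}          # cookie ID -> list of pages visited

    def serve(self, cookie, referer):
        if cookie is None:                      # first sighting: issue an ID cookie
            cookie = f"uid-{next(self._ids)}"
        self.profiles.setdefault(cookie, []).append(referer)
        return cookie                           # echoed back as Set-Cookie

class Browser:
    """Toy browser with a per-domain cookie jar."""
    def __init__(self):
        self.jar = {}               # tracker domain -> cookie value

    def visit(self, page, embedded_trackers):
        # Each embedded third-party resource receives its own cookie
        # plus the visited page as the Referer.
        for domain, tracker in embedded_trackers.items():
            self.jar[domain] = tracker.serve(self.jar.get(domain), page)

tracker = Tracker()
me = Browser()
# Three unrelated sites, all embedding the same tracker:
for page in ["bbc.co.uk/hunger-games", "example-cinema.co.uk", "some-shop.com"]:
    me.visit(page, {"ads.example-tracker.com": tracker})

print(tracker.profiles)
# A single cookie ID now stitches all three visits into one browsing history.
```

Nothing here requires the sites themselves to co-operate with each other; the common third party does all the stitching, which is precisely the spider-web Collusion sets out to visualise.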

With that in mind I decided to install an add-on for Firefox that I’d come across recently. It’s called ‘Collusion’, and its developers say of it that:

“Collusion is an experimental add-on for Firefox and allows you to see all the third parties that are tracking your movements across the Web. It will show, in real time, how that data creates a spider-web of interaction between companies and other trackers”.

The results are both interesting and more than a little unsettling. One thing is certain: there are a huge number of agencies out there who are very interested indeed in your browsing habits. With, presumably, a great deal of valuable intelligence to be gained, I don’t see much hope for any of them voluntarily respecting a ‘do not track’ instruction while there’s nothing really to stop them ignoring it.

Permalink 4 comments (latest comment by Clive Hilton, Saturday, 24 Mar 2012, 21:14)

So little time

Visible to anyone in the world
Edited by Clive Hilton, Sunday, 4 Mar 2012, 16:34



If ever there was a symbolic illustration of how little free time I have in my life these days, this painting is it. In total it probably took around eight hours or so to paint. Sadly, those eight hours were spread across nearly a month. What with TMA marking, day schools, work commitments, client demands, family life, domestic emergencies and the ordinary hustle and bustle of everyday life, I began to feel I'd never finish this picture. When I started it I was clean shaven. Over the time it took me to complete it I grew a beard (which I started to paint in) and then, a few days ago, I shaved the beard off, which necessitated painting out the beard and getting things back to more or less where I'd started nearly a month ago. I finally completed the picture in the wee hours of this morning. All of the painting was done either very late at night or in the small hours, which I think is reflected here in the slightly weary look, especially round the eye.

There are passages that I really like, especially where I managed to achieve something with a single brush stroke (as opposed to my too-often overworked scrubbing around). I wanted to create something rather more enigmatic than the earlier portraits, and the off-centre composition adds a tension that I find strangely compelling. My daughter thinks it's spooky, which I like.

Permalink 12 comments (latest comment by Gil Dekel, Monday, 16 Apr 2012, 19:33)

Starry Starry Night - and then some

Visible to anyone in the world
Edited by Clive Hilton, Saturday, 11 Feb 2012, 01:04



Starry Night (interactive animation) from Petros Vrellis on Vimeo


Occasionally one comes across a piece of work, whether in the world of atoms or the digital realm of bits and bytes, that represents a quantum shift of imaginative and creative endeavour; the kind of work that makes one feel very humble indeed. And Petros Vrellis’s sublime, gentle visionary wonder is such a work. His starting point is Vincent van Gogh’s The Starry Night. The painting itself, as anyone who’s even remotely familiar with it will know, is a night scene looking out over Arles, with the blue mountains on the far horizon, underneath which a swirling maelstrom of stars dance and whirl in a sky that fills two-thirds of the scene and bears down on the town and fields below. One can feel dizzy simply gazing upon it. What Vrellis has done is to turn this iconic image into an interactive work of art in its own right.

Admittedly, couched in those often dreadful and much abused terms – ‘interactive art’ – things don’t sound promising. But Vrellis has the delicate aesthetic sensibilities of the finest artist, and as such he’s sought – and succeeded – in bringing the picture alive in a way that seems not only natural but so right that one wonders why it hasn’t been done before. The video shows the painting slowly coming to life; still at first, the painted swirls and daubs grow, swell, glide and pulse with gentle rhythm, light and harmony. And then comes the interaction. As hand and fingers drag across the surface, the vortices organically swell and shift, ebb and flow in response. The magic of Vrellis’s coding skills is such that movement is not uniformly responsive across the entire picture plane; it’s the sky that responds most freely and fluidly, while the fields and town quietly rock to and fro to a more subtle tune.

Like the original that served as the inspiration, this is a masterpiece. Click on the link underneath the video still image to see for yourself.

Permalink 1 comment (latest comment by Gillian Wilkinson, Sunday, 12 Feb 2012, 10:16)

When the lights go out

Visible to anyone in the world
Edited by Clive Hilton, Thursday, 2 Feb 2012, 02:26


Anyone with any sensitivity to art could not look upon these images and remain unaffected by them. Painted between 1967 and 2000, they are self-portraits by the American-born artist William Utermohlen, and were created at a time when the ravages inflicted upon his brain by Alzheimer's disease were already such that his work tangibly manifests his dwindling capacity to externally reflect upon and render his sense of self. The paintings and drawings stop in 2000.

“He died in 2007, but really he was dead long before that," explains the bright-eyed woman to a room full of sympathetic listeners. "Bill died in 2000, when the disease meant he was no longer able to draw.”

This painting was one of a series presented by Utermohlen's widow, Patricia Utermohlen, and Dr Shelley James at an Urban Times supported event held in the GV Art Gallery, London on 26th January 2012 as part of the Trauma series.

The press notes for the event explain the nature of the exhibition and seek to contextualise Utermohlen's work. While the paintings undoubtedly function as an artistic and artifactual record of the artist's tragic decline into cognitive oblivion, they also serve a role as medical documentary evidence that might contribute to some advance in understanding the aetiology of Alzheimer's:

Doctor Rossor’s team and his nurse Ron Issacs encouraged him to continue drawing and portraying himself. The last self portraits painted between 1995 and 2001 are unique artistic, medical and psychological documents. They portray a man doomed, yet fighting to preserve his identity in the face of an implacable disease encroaching on to his mind and senses. With perseverance, courage and honesty the artist adapts his style and technique to the growing limitations of his perception and motor skills to produce images that communicate his predicament.

The series of paintings created between 1967 and 2000 – and especially those from 1996, painted within a year of the initial diagnosis – record with frightening clarity the dreadful, inexorable destruction of a mind. The last image, drawn in 2000, is almost too painful to regard. It's as though, for one final, vanishingly small moment, Utermohlen was able to see and know himself one last time before the blackness overcame him.

I wasn't at the event itself, but a reviewer for New Scientist, Andrew Purcell, was, and his words echo my own horrified realisation of what Utermohlen must have endured before he succumbed to nothingness.

"That Utermohlen was able to continue with his art as his disease progressed amazed the evening's final speaker, Stephen Gentleman, neuropathologist at Imperial College London. “It’s astounding,” he says. “Utermohlen just shouldn’t have had the mental ability to be able to carry on doing these as long as he did.”

"Then came the bombshell - the words that stuck with me and played over in my head as I lay in bed later that evening: “It sounds awful,” Gentleman told me, “but in cases like these, you really hope that the patient themself loses understanding as quickly as possible, because to be in a body whose brain is failing and still have insight into what is going on must be simply horrendous.” The works on display indicate that Utermohlen did not have even this small mercy."

Tragic and unsettling though Utermohlen's final paintings are, when one looks beyond the bleakness of his fate what comes through is the realisation that creative expression and the capacity and overwhelming drive to create, to record and to leave some tangibly unique testament to one's existence appears to be so strong, so primordial and so intrinsic to the human condition that it is metaphorically hard-wired into our brains. And it's only when Alzheimer's has finally wrought its ultimate necrotic havoc and the creative light flickers out that we can reluctantly face the painful reality that while the body might continue to survive, beyond any shadow of doubt, the light of a mind has been forever extinguished.


Thanks to Jeremy for bringing this to my attention.

New Scientist, Culture Lab: Self-portraits of a declining brain

William Utermohlen's official website

Urban Times - Art & Alzheimer's


Permalink 10 comments (latest comment by Clive Hilton, Saturday, 11 Feb 2012, 00:55)

Drawn to it

Visible to anyone in the world

This portrait is somewhat out of sequence in relation to the recent paintings. It's from one of my sketchbooks, and I rattled it off in about twenty minutes one evening in early December 2011. It was one of those sketches that simply drew itself; every line seemed right and the whole thing was effortless – to use a contemporary idiom, I was in the zone. It's weird the way it sometimes happens, and to this day I've not been able to reliably get myself into the zone at will, which can be infuriating on those occasions when things simply do not go well and everything seems difficult. Strange, strange processes at work. Until today, I hadn't looked at this sketch since the time I initially drew it, and looking at it again I really like it. If only I could turn it on at will.


Permalink 7 comments (latest comment by Alison Hemmings, Friday, 12 Mar 2021, 13:13)

Lost in time

Visible to anyone in the world

Creativity is intoxicating stuff. Anxious to keep my painting momentum going, I thought I'd have a go at a quick sketch, just to keep things moving. For this picture I painted with the surface horizontal as I looked upwards into the mirror. The diffuse tungsten lighting was largely from behind and a bit to the sides and it cast a tangibly warm hue.

Well, a 'quick sketch' turned into a non-stop eight-hour session that ended at four o'clock in the morning! Time just evaporated, and when I finally put my brushes down I had it in my mind that it was about midnight. No other creative activity that I do seems to induce this timewarping quality quite so much as painting does. Weird and delightful at the same time.

For what it's worth, I'm really pleased with the result and it's great to see something of the freedom and loose, confident mark making that I used to have beginning to make a return. For me, it's certainly the most successful of my three efforts so far.

Interestingly, I'm beginning to think that age has brought a slightly different way of seeing too. I can't put my finger on it, but something is different about the way my eye analyses the scene in front of me now compared with the way it did years ago.



Permalink 2 comments (latest comment by Clive Hilton, Sunday, 22 Jan 2012, 16:25)

Learning to be skeptical about learning styles

Visible to anyone in the world
Edited by Clive Hilton, Thursday, 19 Jan 2012, 23:12

OU students who are, or once were, enrolled on U101: Design & Creative Thinking in the 21st Century will no doubt recall the TMA11 assignment which, despite the odd numerical sequencing, is actually the first TMA that U101 students encounter on the module. For those who aren't familiar with it, the nature of the assignment is to get students to undertake a series of non-threatening activities, such as finding as many uses as they can for a paper bag within five minutes, or, using a series of circles printed on a poster, drawing 'roles' that exemplify aspects of their lives. Additionally, students are asked to make comments and pronouncements about their learning styles as determined by an online learning style survey, the results of which lead to the following learning style categories:

  • Visual Learning Style
  • Aural Learning Style
  • Verbal Learning Style
  • Physical Learning Style
  • Logical Learning Style
  • Social Learning Style
  • Solitary Learning Style

For me, most importantly in my role as a tutor, this is where things get tricky. As part of the feedback, tutors are expected to make meaningful comment on the student's indicated 'preferred' learning style and – more significantly – there is an implicit concomitant assumption that tutors will then use the information to adapt their teaching and tutorial support for each student based on the student's indicated learning style preference. That's quite an assumption. The shocking and inconvenient fact of the matter is that when it comes to learning styles and the effectiveness of teaching in response to learning style preferences, I've found myself moving from a position of relaxed agnosticism to one of deep skepticism. In short, I've come to think that designing instruction for learning styles is largely an alchemical delusion.

There is something undeniably appealing about the notion that one can teach so much more effectively if one somehow modifies one's approach to the particular preferences of any number of students. But one has to dig only a little below the surface of the idea to become very much aware of real and persistent nagging doubts. But let's go with the flow for a moment. A teacher, faced with a class of, say, 30 students, is fired up with a determination to deliver instruction that responds to the learning style preferences of the assembled students. The obvious first question is: in real, practical terms, how precisely is s/he going to do that? Even assuming that the students' learning style preferences fell neatly into the seven learning styles listed above, how can the teacher accommodate all these contrasting learning styles, at the same time, for all the assembled students, within a single lesson time slot?

A more personally empirical piece of evidence that has long cast suspicion in my mind that the learning style mantra is somehow fundamentally flawed is that whenever I take one of these learning style surveys I never seem to get the same result twice, even when I do the same one in rapid succession. Some of them have me down as a hardline logician, others as some sort of day-dreaming visual learner, or as a physical learner with a fondness for risk taking and experimentation. The stark and inconvenient reality is that I am, of course, all those things and a great deal more besides, as most people are in life. Surely, if these learning style surveys were worth their salt, then over time they would at least show a statistically better than evens chance of flagging up a consistent learning style for any particular individual?
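The test–retest hunch can be put in concrete terms with a toy simulation in Python. To be clear, this isn't modelled on any actual survey instrument; the `reliability` parameter is an invention purely for illustration, standing in for how faithfully a survey reflects a stable underlying preference (if such a thing exists at all):

```python
import random

STYLES = ["visual", "aural", "verbal", "physical", "logical", "social", "solitary"]

def noisy_survey(true_style, reliability, rng):
    """Return the style the survey reports: the respondent's 'true'
    preference with probability `reliability`, otherwise a random style."""
    if rng.random() < reliability:
        return true_style
    return rng.choice(STYLES)

def repeat_rate(reliability, takes=1000, rng=None):
    """Fraction of consecutive retakes that give the same answer."""
    rng = rng or random.Random(0)       # seeded for reproducibility
    results = [noisy_survey("visual", reliability, rng) for _ in range(takes)]
    same = sum(a == b for a, b in zip(results, results[1:]))
    return same / (takes - 1)

print(f"reliable instrument: {repeat_rate(0.9):.2f}")   # high agreement on retest
print(f"near-random survey:  {repeat_rate(0.1):.2f}")   # barely above chance (~1/7)
```

The point of the sketch is simply that a measure capturing something real should agree with itself on a retake far more often than chance (one in seven, with seven categories); my own experience of these surveys feels much closer to the coin-toss end of that scale.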

It's a topic I hope to come back to, but in the meantime I leave with the words of Richard E. Clark, Professor of Educational Psychology and Technology Director, Center for Cognitive Technology, University of Southern California:

"Three major reviews of the research on learning styles have been published in top journals in the past decade. All of them have reached the same conclusion. Learning styles do not predict learning under different instructional conditions. There are no "visual" or "verbal" learners etc. No reviews of the research on learning styles have reached a positive conclusion. There are studies of learning styles (many of them designed by advocates or sales people for different style measures) that reach positive conclusions but the reviews conclude that those studies are poorly designed (or at least designed to find positive results for favored style measures)."

Permalink 8 comments (latest comment by Clive Hilton, Thursday, 26 Jan 2012, 18:27)

A bit less rust

Visible to anyone in the world
Edited by Clive Hilton, Saturday, 14 Jan 2012, 13:05

The quest to dust off the old painting skills continues. Another self-portrait I'm afraid; largely for the same reason I did the first one! This self-portrait is larger than life-size, which is the first time I've ever approached a portrait in this way, and it took a little while to correctly eyeball the proportions and relationships between the key elements while visually scaling them up on the painting surface. Things seemed to happen much faster this time around and there was more fluency and confidence in the mark making. The whole painting took less than half the time of the original, which is quite a step forward. In feel, it is looser and more 'airy' than the first portrait, and the subtle change in lighting tended to soften out details, presumably because of the more diffuse quality of the light.

In case you're wondering why I've got one eye closed, it's because as I've got older my right eye has become a bit lazy and it's sometimes difficult to maintain a single image view when I'm looking intently under these sorts of circumstances. To make things easier I simply close one eye while I'm actually looking at the subject while painting and drawing.



Permalink 2 comments (latest comment by Clive Hilton, Thursday, 12 Jan 2012, 11:47)

Even less intuitive

Visible to anyone in the world

In a previous post I opined, more or less, that in design specialisms such as interface design, for example, any quest for some sort of universally intuitive solution is unequivocally doomed to failure. The reason for this, I argued, is that before any user of interface-driven devices can get to grips with them they must, of necessity, call upon techniques, schemas and processes that they've learned, acquired or become familiar with in past engagements with similar - or even not so similar - devices. In short, users call upon experience and familiarity when faced with new interface challenges; intuition - "the ability to acquire knowledge without inference or the use of reason [1]" - has no part to play in the process at all. Should past experience be of no use in unlocking the mysteries of a new device, then the user has no other option than to try to fathom the underlying working principles from scratch.

If users of new devices can get to grips with them quickly and efficiently then it will have nothing whatsoever to do with some near-supernatural process of intuitive insight, but rather will be the result of being able to bring a wealth of experience and familiarity with other systems to bear on the situation. This adaptive strategy – predicated on hard-won experience – in combination with a willingness to engage in a suck-it-and-see approach, can sooner or later lead nascent users of novel devices to a position of useful functioning effectiveness. So why do designers persist in burdening themselves – and their clients – with the challenge of seeking some chimeric impossibility, while at the same time condemning some end users to a frustrated and unproductive engagement with the fruits of their efforts? The invidious double-whammy for those users who don't 'get it' is that not only can they not use the device, but they are made to feel stupid because they can't engage with something that they are told is so 'intuitive'.

I've come to think that the root of the problem lies with slack etymology. If one were to substitute the word 'familiar' for 'intuitive', then pretty much all of the vagueness and unnecessary academic contortions that surround the issue of 'intuitive' design disappear. When people are able to get to grips with new devices quickly, it is because the interface draws upon much that the user is already familiar with. Conversely, where users are unable to fathom a new device, it's because it draws upon little that the user is familiar with. Suddenly, the sticking points and barriers to a user's functional, competent engagement with objects become much easier to identify, because they can be measured against the range of experience and familiar skillsets that the intended user was assumed to possess, evaluated right at the initial design and testing stages. Where there is a significant mismatch between the expectations of what an end user is assumed to be able to bring to the party and the reality of the end user's actual reservoirs of familiarity, therein lies the fault-line that will trigger user frustration.

Leaving aside the grammatical connotations of using 'intuitive' rather than 'intuitable', my argument here is that the word itself is subject to so much misunderstanding and confusion about meaning that to use it as some arbiter of design success is an exercise in futility. A quick survey of the literature that touches upon the notion of intuition frequently turns up mention of the difficulty of defining it. A telling example comes from a paper entitled Investigating Familiarity in Older Adults to Facilitate Intuitive Interaction:

"One way of improving the usability and interaction of contemporary devices is to ensure the user interfaces are intuitive to use. There is no concrete definition of intuition." [my emphasis][2]

This bizarre academic posturing is nonsensical at every level. Firstly, the assertion that there is no concrete definition of intuition is plainly, factually wrong. The OED has produced a succinct and perfectly clear definition of intuition: "the ability to acquire knowledge without inference or the use of reason". That aside, even were Lawry et al's assertion true, we would have the ludicrous position whereby the claim is that user interfaces can be improved by making them more 'intuitive' but, since we don't know what 'intuitive' means, how is it even possible to test the hypothesis? It's a bit like being told to measure an object against a standard that we can't define.

Why Lawry et al didn't think to refer to a standard authority on the meaning of words via a dictionary is not clear, but what's more surprising is that they turned instead to one of the paper's co-authors, Blackler [3], who also doesn't know what intuition means, and in doing so tied themselves up in a tautological knot that turns the whole issue into a mess:

"It is important to make a distinction between intuition and intuitive interaction. Intuition is a cognitive process, while intuitive interaction is the use of intuition during an interaction with a product. Blackler states the following definition: 'Intuitive use of products involves utilising knowledge gained through other experience(s) (e.g. use of another product or something else). Intuitive interaction is fast and generally non-conscious, so that people would often be unable to explain how they made their decisions during intuitive interaction'" [3]

Blackler's definition is not a definition of intuition. It is, however, in character rather closer to a description of something like 'discernible understanding', which embraces notions of 'experience', 'familiarity' and 'tacit knowledge'.

Blackler's use of the word intuition is not helpful here. While recognising the role of 'knowledge gained through other experiences', Blackler's account focuses on users who can successfully interact with a device but who are unable to objectively describe how they do so. What it doesn't address is those users who can't engage with a device. The unspoken contention appears to be that such users lack an 'intuitive' insight, when the pragmatic reality is that they possess an unfitting background of tacitly acquired skills that, while they might be useful in other circumstances, can't profitably be brought to bear on the particular situation in hand. And under Blackler's explanation, by extension, the fault appears to lie with the user rather than the designer.

If designers and creative thinkers really are to get to grips with the issue of making user interfaces more effective then it's time to start discussing the debate in terms that actually make sense. In a future article I hope to question why intuitable design solutions may not even be always desirable in all cases.


2 Lawry, S., Popovic. V., Blackler, A.(2009) Investigating Familiarity in Older Adults to Facilitate Intuitive Interaction Queensland University of Technology, School of Design, Australia

3 Blackler, A. (2008) Intuitive interaction with complex artefacts: Empirically-based research, Ed., VDM Verlag, Saarbrücken, Germany.


New blog post

Visible to anyone in the world
Edited by Clive Hilton, Monday, 2 Jan 2012, 23:08

According to John Singer Sargent, "A portrait is a painting with something wrong with the mouth". I know precisely what he means. Over the recent Christmas break I decided that I'd make an effort to get back to painting again. It came as a shock to realise that I hadn't painted at all in more than twelve years - how did that happen? - and I've recently begun to feel quite keenly the loss of not painting anymore.

Determined to strike while the iron was hot, I popped down to the art shop to buy a few essentials and returned to my study full of vim and an eagerness to simply get stuck in. In the days when I used to paint and draw regularly it was largely figurative work that I enjoyed most, so I decided that I'd have a go at a self-portrait – primarily for the simple pragmatic reason that no-one else (understandably) would be prepared to sit around doing nothing for hours on end while I dragged the long-neglected and rusty painting skills out into the light of day to probably-horrific effect.

And so I sat for the next couple of hours, undaunted, glaring balefully into the mirror, brush in hand, clumsily gobbing blobs of colour onto the formerly pristine surface as I wrestled to conjure up a likeness. Eventually I conceded defeat. The results were manifestly less than impressive, and any possibility that the resulting image might actually bear any resemblance to my own careworn fissog was a notion that even the kindest of readings could only truthfully declare to be somewhat misguided. A more realistic and honest appraisal would assert that the face staring back out from the picture surface was that of someone who'd recently experienced some horrific disfigurement involving either a chemical fire or a lawnmower; possibly both. Yet, strange to say, despite this, I wasn't at all downhearted or frustrated. I'd expected things to be a bit ropey and sure enough they were, but what came out of the exercise was a tangible sense that it hadn't all gone, as I'd feared it might have, in confirmation of the 'use it or lose it' aphorism. In amongst the carnage before me there were enough small signs to think that something of the old skill was still in there. Certainly enough to warrant having another go. So with that in mind, I cheerily ripped up the mess in front of me, cleaned up, awarded myself a beer and vowed to have another go the next day.

This time I approached things rather more carefully. I moved slowly. I spent a long time simply looking. I set out to gradually form the broad masses, the relationships between light and shadow, proportions and colour values. I painted slowly. Very, very slowly. Far slower than I ever had in my prime. And very, very gradually, a face began to emerge that I half recognised. Best of all, that wonderful sensation of becoming lost in a painting quietly began to envelop my conscious thoughts, and it was only when I began to notice that my feet and legs were aching that I looked up to discover that six hours had somehow evaporated, seemingly in minutes, and that it had become dark outside. A little more of the painting magic had returned. I'd become hooked again. And so it was more of the same over the next two days; standing, looking, painting, looking, another dab of colour, another accent and lots more looking. Some eighteen hours later I decided that I'd done as much as I could. The result isn't great; it's very tight; the sum of the parts doesn't quite add up to the whole, and there isn't the freedom in the brushwork and the confidence of mark making that I used to be capable of. And yes, there's definitely something wrong with the mouth; but after twelve years of total abstinence it was far better than I was prepared to hope it would be.

One of my fondest hopes for 2012 is that I can reacquaint myself more fully with my long-lost dear old friend, the painter.


Permalink 2 comments (latest comment by Clive Hilton, Tuesday, 3 Jan 2012, 00:48)

There's nothing intuitive about intuition

Visible to anyone in the world
Edited by Clive Hilton, Friday, 25 Nov 2011, 08:17

In the worlds of ergonomics and user experience design it seems that one of the most desirable of all outcomes - the golden ideal - is to conjure up a solution that, while novel in concept, is nevertheless rapidly and painlessly understood and absorbed by novitiates encountering it for the first time. This is especially true in areas where there is a strong commercial imperative or where lives are at stake. Such ideal solutions are not uncommonly described as 'intuitive'. But are they intuitive? Really? Indeed, is it ever even possible to design any truly intuitive interface at all? (Incidentally, shouldn't that be 'intuitable'?)

The OED defines intuition as, "the ability to acquire knowledge without inference or the use of reason". In plain terms, intuition means that people 'just get it', whatever 'it' happens to be. In the context of application interface design, any strict interpretation of this definition would seem to imply that practically any reasonably functioning person could be placed in front of some popular computer application interface that is widely held to be 'intuitive' and - let us say - despite having no prior experience of computers at all would quickly work out how to use it 'without inference or use of reason'. Manifestly, the chances of this actually happening are probably going to be confined to the vanishingly thin extreme end of a probability bell curve labelled, 'miraculously unlikely'.

Any protest that goes along the lines that such an example is unreasonable (because of the subject's complete lack of experience with computers per se) is logically contradictory and powerfully self-defeating. Either a thing is intuitive (ie, one simply 'gets it') or it is not. If someone who has never used a computer before can rapidly use it at first sight then that would appear to be an intuitive act; even more so if lots of such people could pull off the same trick. If (as is more likely), a person does not intuit how a computer works because she or he has never used one before then what is lacking is not some intuitive insight - quite the opposite. Crucially, what is lacking here are the very things that intuition has, by definition, no requirement for at all - prior knowledge, familiarity and direct, hard-won experience.

Demonstrably, intuition and intuitive insight have no part in the process at all. It's a theme I'll aim to pick up on further.


30 second dog[gerel]

Visible to anyone in the world
Edited by Clive Hilton, Wednesday, 16 Nov 2011, 10:07

Never sit upon the seat of a bicycle made for two
unless it's pointing forward.
(You know what it could do!)


For the background to the origination of these 30 second offerings, all is explained in a previous post.


Advertising - the madness

Visible to anyone in the world
Edited by Clive Hilton, Tuesday, 15 Nov 2011, 10:55

In our developed consumer culture it is now virtually impossible to escape advertising. There is seemingly no piece of real estate or virtual space too small to hold an advert. Brands now pay games developers to place adverts and sponsor notices inside virtual worlds. Fill up at a petrol station and look at the pump handle - and there's an ad. Look up and you'll see loads of them. By the time you've walked to the till to pay it's probable that you will have seen literally hundreds more.

Of course, to some extent, one develops a sort of selective blindness to most of them, but one simply can't escape them all. What's really beginning to get my goat though is the extent to which I now feel overwhelmed by them. Take the X-Factor, for example. I'm personally not a fan of the Saturday evening brain rot that is The X-Factor (with the X representing the amount of money Simon Cowell makes from the whole depressing affair) but others in the Hilton Towers household love it. What struck me forcibly was the sheer quantity and frequency of ad breaks. I was not alone, it seems. There have been complaints and some figures declare that for every 7 minutes of programme time there are 4 minutes of unbroken ad breaks. Ye gods!

And don't think the BBC is immune from the ad plague, either; the only difference with the beeb is that its adverts are simply for its own offerings, though I would concede that they don't seem as numerous or protracted. Mostly.

I now read that HarperCollins is seriously considering placing ads in ebooks:

“Certain kinds of books [fiction] create immersive reading experiences whereby ads would be too interruptive for readers, and publishers and even advertisers aren’t likely to put a premium on that," said HarperCollins group digital director David Roth-Ey, in an interview with New Media Age.

"But information books, for example a Collins birds guide, could provide very valuable real estate for contextual advertising - in this case potentially a binoculars manufacturer,” he added.

Oh goody! So for the time being at least, it seems we're to be spared adverts in fiction, but as far as non-fiction goes let the floodgates open.

But I guess what really gets my goat is that all this junk - for which consumers pay handsomely - only ever results in an inferior, frustrating experience for the user. The worst expression of this that I can conjure up is the experience of viewing a legitimately acquired and paid-for DVD film. First of all they ain't cheap. Then when you want to watch them you are presented with a non-skippable bollocking telling you about the evils of pirated DVDs and how it's killing the film industry! [Er, right, that's why I bought this DVD, so why are you lecturing me!] This is followed by often-unskippable film previews - essentially adverts - that are also pushed down your throat. Eventually, some ten, fifteen minutes later, you are finally allowed to watch the damn film you paid good money to watch 'at your own convenience'.

Qualitatively contrast this frustrating and abusive experience with what happens when one views films from, er, shall we say, unconventionally sourced DVDs (that I may or may not have viewed on the odd occasion somewhere). Stick said DVD into player. Hit play. Watch. Eject when finished. No ads, no guff, no hectoring. Bingo!

OK the film quality itself might not be as good as it could be, but frankly that becomes less of an issue in the overall scheme of things. Surely, paying handsomely for a thing should result in the optimum experience, not the worst.

Now, dear reader, tell me where's the incentive to play it straight and legal?



Permalink 3 comments (latest comment by Mark Becker, Wednesday, 30 Nov 2011, 21:07)

How touching

Visible to anyone in the world
Edited by Clive Hilton, Monday, 14 Nov 2011, 10:45

In the world of computer interfacing, touch screens have been around for yonks. The mass uptake of smartphones and tablets has in no small part been due to the brilliance and efficacy of touch screens as applied to these particular form factors, and the ability to engage directly with content and the interface by intuitive finger-swiping actions, as against other more traditional means such as the keyboard, mouse or stylus, is demonstrably better.

Set against this ergonomic efficacy, of course, are a couple of trade-offs that consumers seem content to put up with. Both centre on the ubiquitous shiny, highly reflective screens, which mean that using such devices under bright lighting conditions - as on a sunny day, for instance - is near impossible and, to compound matters, the screens quickly take on the appearance of having been used by someone whose fingers have been marinated in lard for a week or two. Of course the issue is easily solved: manufacturers need only furnish their offerings with matt screens and it more or less goes away, though presumably manufacturers would argue that in doing so the devices would lose some of their visual sexiness. Whichever the case, the point here is that for such devices, touch screens clearly work and work well.

Given the success of touch screen on portable tablets and smartphones, one can easily see why the likes of Microsoft and others would want to extrapolate the model and extend it to large format devices, such as PC monitors and large screen all-in-one computers. From what I've read of the proposed offerings of Windows 8 - the next generation of Microsoft's operating system - one of the key paradigm shifts is an overwhelming move towards touch screen interaction across all hardware configurations, including large format computer screens. From an ergonomic perspective, this is a pile of poo.

It's one thing for Tom Cruise to whizz stuff around on a large glass screen in Minority Report via a series of rather impressive choreographed hand and arm gestures, it's another thing entirely doing the same thing in real life. The stark reality is that to oblige someone to hold their arm extended for long periods of time while they exercise fine motor control of their wrist and finger tips is to rapidly inflict upon them a whole world of pain and neurological distress. And I'm not exaggerating.

Try it for yourself. Go through the motions of pretending that you have a touch screen computer monitor: as you sit in front of it, reach out and 'interact' with the screen. 'Move' things around on the screen, as you imagine you would have to, with your arm fully or partially extended. Do it for fifteen minutes, or as long as you feel comfortable with. Try 'browsing' the web as you would normally, flitting frequently from site to site. Feel your shoulders tensing up? Feel your arms aching? Feel any tension building? Aside from the physical sensations, how good does the screen now look with a trail of chip-fat fingerprint smears all over it? If you had to do this all day long - if this was your only way of interfacing with your computer - how long would it be before discomfort turned into excruciating agony? A day; a week; a month; a year?

Now it might be that between now and final release Microsoft will have a change of heart about employing touch screen interfacing as the default means of user interaction (I'm told that there is a poorly implemented option to revert to a traditional mode of working for users who don't have touch screens). I'm not entirely convinced that Microsoft have thought this one through. What works well for one form factor does not necessarily work for another. Just because a thing can be done doesn't mean that for all circumstances it should be done.

Get it wrong and quite simply it's a pain in the neck for everyone.


Permalink 1 comment (latest comment by Jeremy Ashcroft, Tuesday, 15 Nov 2011, 16:19)

How interesting...

Visible to anyone in the world

The apocryphal Chinese malediction, 'may you live in interesting times', seems to have come to pass. At the time of writing, Italy has just seen the cost of its borrowing hit 8% per annum - beyond the level at which Ireland was forced to seek additional aid to stave off bankruptcy. Greece has already defaulted in all but name and the likelihood is that Portugal may well be next in line. I'm not an economist but, as best I understand these things, the real fear is that the emergency funds available to help 'bail out' the countries in trouble simply may not be enough, and in that event, well, put it this way: history tends to show that the consequences can be bad. Very bad. There can be little doubt, it seems to me, that we are witnessing the first significant tremors in what will be a fairly swift and possibly very distressing shifting of the world order.

It's likely to be very messy. Conceivably, revolutionary (in the truest sense of the word). Possibly calamitous. Based on the events of the 1930s, it could even lead to war.

In that context, as an adjective, 'interesting' seems barely adequate.

Permalink 3 comments (latest comment by Clive Hilton, Thursday, 10 Nov 2011, 00:11)

Dog tired. Streams of consciousness

Visible to anyone in the world

For some bizarre reason I’ve recently been emerging, still knackered, from my night’s sleep only to catch my subconscious mind unawares, red-handed in the act of what I can only fairly accurately describe as ‘arsing about with words’. Now for most of my life to date, the act of awakening and finding oneself obliged to kickstart the old metabolism into gear has usually been accompanied by a short continuation of a dreamlike state in which pictures and scenes swirl around, as they do in sleep, before full-on consciousness rears its ugly head and it’s time to get up and let the images fade to grey. So far so normal.

Of late, however, I still get the familiar dreamy state, only now it’s words and sentences that rush about in a mad stream of consciousness. Words tumble and fall, tripping over themselves in a rush to make themselves heard, and then what happens is that they seem to shape themselves into short bursts of rhyme. It might be only a few words, and then suddenly two or three sentences will start to buzz along and then these give way to little bits of doggerel that conjure themselves up seemingly without any effort on my part.

Indeed, just as in a dream, if I try to concentrate on the words they seem to recede out of range. I’ve learnt to just let the show flow along and see what happens. And what happens is that occasionally, one of these bits of doggerel pops out fully formed very quickly - probably in less than 30 seconds. Mostly - and this too is just like a dream - no sooner has the verse appeared than it starts to fade and I struggle to remember it. However, some of them appear to be more memorable to my creaking neurons than others, so I memorise them before they disappear. Here’s one of them:

I saw a spider on the stair,
I squashed it flat with half-a-pair
of ballet shoes my daughter bought.
She used them once. It came to nought.

Now I know it’s not exactly Keats or Wordsworth - or even Lear, for that matter - but as a piece of stuff that my subconscious brain kicked out in less than thirty seconds it certainly makes me wonder what else it’s getting up to while I’m not looking.

Permalink 3 comments (latest comment by ROSIE Rushton-Stone, Thursday, 3 Nov 2011, 02:24)

Censure and sensibility

Visible to anyone in the world

OK, mildly contrived and technically inaccurate subject title I know.

Personally, I think that a recent ODS posting of the sexy/salacious/glamorous/disgraceful/beautiful/morally corrupting image* is one of the best things that's happened across the U101 presentations to date.

The debate that it has sparked is fantastic and has led to some really interesting drifts in the discussion to new areas that are so relevant in today's complex society and which, as design and creative thinkers, I believe we absolutely have to engage with.

We're seeing argument on issues such as censorship, morality, aesthetics, social and cultural responsibility, tolerance (or lack of) and the questioning of whether the OU or 'someone' ought to be exercising some sort of tangible control of events.

I must congratulate all parties for their contributions, which in my view are helping to guide the discussions away from the initial moral outrage, as evinced by the 'Disgusted of Tunbridge Wells' sort of response, into something rather more considered and nuanced.

My own view (which I'll repeat here) is that had there been a proscriptive policy in place that would have prevented the posting of the original image, then we'd all suffer a credibility issue - students, tutors and beyond. The U101's nascent reputation as a worthy and serious creative thinking degree level course can only be enhanced if it's clear to all that we believe that we encourage debate, that we challenge ideas, that we are prepared to discuss complex and potentially contentious issues as mature, open-minded members of society.

Or we could simply talk about bananas.

*It was an image of a woman in a skimpy 'bathing suit'.

Permalink 3 comments (latest comment by Jeremy Ashcroft, Friday, 28 Oct 2011, 21:07)

Will's words. Note to self

Visible to anyone in the world

I like hearing Will Self speak and I like to read his writings, though I have to admit that, to date, it would appear that, perhaps tellingly, I don't like them enough to want to buy one of his novels. And it was while reading an article of his in the Guardian online today that I was able to pinpoint why I'll probably continue to politely decline his longer written offerings; simply, they would take me too long to read.

Listening to Will speak or reading his written works is - for me - like learning English all over again. I always emerge from our encounters feeling like I've had a damn good workout; a bit knackered, though, despite the discomfort, suffused with a sense that I'll be better off for it in the long run. He routinely uses words I've literally never encountered before and he throws around an immensely wide-ranging and, on occasion, selectively arcane vocabulary with consummate ease. He's brilliant at it and I really enjoy engaging with his use of language, but I'm never in any doubt with Will that there will come a point somewhere - and it usually happens pretty early on - when the inadequacies of my literary education will reveal themselves only too clearly and I will have no discernible option other than to reach for the dictionary to discover the meaning of yet another new word that he's mined from his vast lexicon.

The article I referred to, "Will Self: The trouble with my blood" is neatly summarised in a sub-heading, "Diagnosed with a rare blood disease, Will Self has to endure weekly 'venesections' in hospital. He reflects on illness, addiction and mortality".

And so he does. In the scheme of things an article of this length would normally consume no more than a minute or two of my time and would rarely trouble me in terms of getting to grips with mere wordage. But with Will, well, things are different. It took nearly half an hour and much looking up of words. Here are just a few of the words and phrases that I'll admit to having had to check out in a process of ongoing diligent self-education:

Iatrogenic, apoptosis, acuminate, 'veridical Guignol', 'fictive inscape'.

It's heady and intoxicating stuff. How about this for a gem of a sentence:

"I had trafficked in disease as a metaphor for 20 years now, grafting the defining criteria of pathologies – their aetiology, their symptoms, their prognoses and their outcomes – on to phenomena as diverse as the human psyche and the urban fabric, yet now I had a disease that seemed to me to be a metaphor – although of what exactly I couldn't yet divine – I found myself in a viscid substrate, cultured with rapidly multiplying literalisms."

Er, quite.

Don't get me wrong; I'm not taking the mickey here. In a world in which txtng m8 is the norm and in which there seems to be so much slack, casual and plain lumpen use of language about, to come across writing of this complexity - and, let's be honest, challenging complexity for many (most certainly for me at any rate) - is to savour very nutrient-rich fare indeed.

That said, just as with so much richly flavoured cuisine, it's fine as an occasional treat for which one is happy to put the time aside and make a special occasion of it, but, as likely as not, at a daily level you can have too much of a good thing.

Permalink 3 comments (latest comment by Jeremy Ashcroft, Friday, 28 Oct 2011, 19:43)

How long from saint to sinner?

Visible to anyone in the world
Edited by Clive Hilton, Wednesday, 19 Oct 2011, 00:08

Steve Jobs' death, as one might not unreasonably expect, elicited a lot of eulogising. A very great deal of eulogising. Verging on hysteria kind of eulogising. Y'know, the Princess Di kind of eulogising. Irrational. Barmy.

Now, for Mr. Jobs' family and friends his death is nothing short of a tragedy in the fullest, starkest sense of the word, and for them the world will never be the same again. The pain may dim with time, but I doubt very much that it will ever fade away, and no normally functioning human - not even me - could fail to feel empathy for the recently bereaved family and close ones. For the rest of us, and by that I mean everyone who didn't actually know him, I can confidently say, without meaning to appear callous, the world will return to normal somewhat quicker. Life goes on, hey ho.

Now, I didn't know Mr. Jobs and I've never been a fan of Apple [which doesn't make me a Windows PC fanatic either - I simply can't stand those tiresome Apple v Microsoft turf wars] so at a personal level I just haven't felt the need to wail and bemoan his loss quite so vociferously as some. As it happens, much as I admire what he did to promote an appreciation of and an appetite for expensive, high quality product design among mass consumers, it seems to me that with a cooler reflection on what Apple did actually achieve and has come to mean at a wider level, it's possible that history won't look upon Jobs' legacy as kindly as it does at the moment.

Where once Apple sought to paint itself as a sassy, fun-loving individualistic and 'different-thinking' counter-culture, it has now become the very monster of the type it used to rail against; bullying, ruthless in its suppression of competition by litigious muscle, driven by a determination to cream off a hefty percent of all content that passes through the digital entrails of its products and seemingly desperate to crush the movement for open standards - or perhaps more accurately, anything that isn't Apple standard. And not least, there's the exercise of an iron grip over what the purchasers of its products are permitted to do with them.

Imagine that you bought a TV and the makers of the TV charged you for viewing content as well as taking a 30% cut from the actual makers of that same content (while perhaps even telling them that they could only make content using the TV maker's proprietary systems). Suppose you then want to record a programme and transfer it to your own computer - but the TV makers say you can't. Insane? Well, that's pretty much the Apple paradigm in a nutshell.

You can do what you like on an exquisitely designed Apple product. As long as it's what Apple decides you can do. And for which, of course, you will pay. And not just in monetary terms.

Some legacy.




Permalink 5 comments (latest comment by Clive Hilton, Friday, 28 Oct 2011, 20:30)

Allez les bleus

Visible to anyone in the world
Edited by Clive Hilton, Wednesday, 12 Oct 2011, 22:37

Sincere congratulations, France. The best team won and to the victor the spoils.


Permalink 2 comments (latest comment by Colette de Colbert, Monday, 17 Oct 2011, 22:05)

Eddie, you need to ease off a bit...

Visible to anyone in the world

For those who may not be aware of such things, Eddie Butler used to be a rugby player and, now retired as such, he has been earning a living as a rugby journalist. He is also Welsh. Which of course means that he cannot stand the English. Or at least, the England rugby team. Which is not unusual. Let’s be honest, that puts him among a comfortable majority of the English-hating rugby-following world, for reasons that - as an Englishman - I’ve never really got to grips with. Why do they love to hate us? History? By definition, that’s a long time ago. Why, if you ask a neutral in any game against England who they would want to win, would they say, ‘Anyone But England’? I don’t know. As an England supporter of more than forty years I’ve always sort of recognised this. So how do I feel about it? Well, truth be told, I don’t really think about it much at all. It’s not really my problem. I want England to win. Simple as that. If England aren’t playing at all, I’m not really that interested. I don’t really care who England play; if they win, I’m thrilled. If they lose, I’m gutted. Beyond that - well, so what?

Anyway, Eddie wrote a piece in The Guardian today and one gets the sense that he somehow lost the battle of pretended journalistic objectivity versus chippy bitterness:

“Rugby World Cup 2011: Defiance at heart of Martin Johnson’s strategy

The stubborn manager is determined to obliterate the whiff of scandal with the smell of success”

Powerful medicine. Ostensibly, it was a piece that sought to build on a media-generated ‘scandal’ concerning the on- and off-duty behaviour of the England squad by offering the not-unreasonable proposition that the best counter Martin Johnson could offer to the right-minded critics of the England squad would be simply to beat the French in their forthcoming encounter. So far, so not-so-original. But then comes the most weirdly non-sequitur piece of sensationalist bollocks-speak of a concluding summary that I’ve seen written in a mainstream newspaper:

“It is France now, as it was then. Find the way to beat them again. Smash them. And then smash Ireland or Wales. And then England are in the final. Do you smell that? That’s the stench of winning.

As long as it works. If the Johnson way fails, then the whole house of Twickenham, rotten, decaying, comes down too. Now that would be a smell.”

Eddie, whatever it is you’ve been drinking, stop now.


Why do I do it to myself?

Visible to anyone in the world
Edited by Clive Hilton, Sunday, 2 Oct 2011, 01:02

Good grief! I've posted previously on what it's like for me to be an England rugby fan. I've been following the England team now for more than forty years and believe me, the accrued pain far outweighs the joys by miles. And today was a great illustration of what it means to be an unconditionally loyal fan (and it might be worth remembering that 'fan' is a condensed form of the original 'fanatic'). The game against Scotland was among the very worst I've witnessed, at least as far as being an Englishman is concerned, though truth be told, for altogether starker reasons, it was even worse for any Scot.

Scotland, as ever when they play against England, played as though their very souls were at stake. England did what they can be so good at doing - play with all the spirit, imagination and athletic energy of Winnie the Pooh's most lethargic of friends, Eeyore. Where the Scots have history, grievance and vengeance to fuel their fires, England have placid, bovine indifference and a capacity to absorb pain. Slowly.

And yet, at some terrifyingly late moment in the game it was as though the stultifying torpor that had settled on the men in white from the moment they stepped on the pitch had suddenly been cast from their collective mind, as the players awakened from their somnambulant meanderings to realise the peril they were in. In a flurry of late-found belief they kicked into life and in the dying moments delivered the killing blow.

England won, yet few honest English supporters would say the better team triumphed. So it looks like Scotland, barring an unlikely miracle of circumstance, will go home while England stagger on; befuddled, inept, musclebound, bereft of ideas and just as likely to get hammered by the French as they are to suddenly wake up and believe in themselves and post a monster score. And all I can do is to frighten my children as I hurl invectives at the TV screen.

There's got to be a better way of expressing pack loyalty.

Permalink 2 comments (latest comment by Clive Hilton, Sunday, 2 Oct 2011, 18:02)

If it wasn't for the melancholy I'd be nostalgic

Visible to anyone in the world
Edited by Clive Hilton, Monday, 26 Sep 2011, 21:36

Jumpin' Jeehosaphat! Where the Sam Hill did the last eight months go? Seems like only minutes ago I was getting to grips with the new list of students for the U101 module. (It used to be a course but it went metric, so now it's a module.) Rather delightfully, as I'm coming to discover is the way of these things at the OU, it would appear that the kind of people who are drawn to it are, generally speaking, thoroughly good eggs. U101 (or LOLA to initiates) seems particularly good at attracting the kind of people I'd actually like to meet and have a drink with. As long as they'd been chemically coshed beforehand, of course. Even via an almost entirely electronically delivered module, I still end up with a palpable feeling that over the last eight months I've gained a sense of the personalities of the students I've engaged with - even if only virtually.

So it is with that sense of bittersweet happy-sadness that I brace myself again for bidding au revoir to another cohort passing by my way on the road to greater enlightenment. It's not quite over yet, though it has been fun.




Visible to anyone in the world
Edited by Clive Hilton, Saturday, 27 Aug 2011, 11:51

Mind mapping cartoon

I have a small confession; [whisper] while I do occasionally use mind-mapping myself, I remain a bit of a skeptic when it comes to some of the frankly miraculous claims made for its supposed efficacy. Generally, lazy superficial maps are as bad as lazy superficial notes, only less meaningful.

Anyway, here's an irreverent take on mind-mapping that looks to puncture much of the unwarranted hype surrounding it.

Tony Buzan, take note.


Permalink 2 comments (latest comment by Clive Hilton, Monday, 5 Sep 2011, 17:33)

We did nothing wrong...

Visible to anyone in the world
Edited by Clive Hilton, Friday, 15 Jul 2011, 23:29

Sometimes, it's not even necessary to comment.


Permalink 2 comments (latest comment by Clive Hilton, Wednesday, 21 Sep 2011, 13:29)


Total visits to this blog: 4019