Christopher Douce

Professional Development Conference: London, 22 March 2014

Edited by Christopher Douce, Wednesday, 12 Oct 2022, 09:00

The Open University in London runs two professional development conferences per year, one at its regional offices in Camden Town, the other at the London School of Economics. Saturday 22 March was a busy day; it was the day I ran my first staff development session at this venue.  (I had previously run sessions in the Camden centre, but running a session in an external venue had, for some reason, a slightly different feel to it).

This blog post aims to summarise a number of key points from the session.  It is intended for anyone who might be remotely interested, but it’s mostly intended for fellow associate lecturers.  If you’re interested in the fine detail, or the contents of what was presented, do get in touch. Similarly, if you work within any other parts of the university and feel that this session might be useful for your ALs, do get in touch; I don’t mind travelling to other regions. 

Electronic assignments

The aim of the session was to share what I had discovered whilst figuring out how a tool called the ETMA file handler works.  Students at the university submit their assignments electronically through something called the Electronic Tutor Marked Assignment (ETMA) system.  This allows submissions to be held securely and the date and time of submission to be recorded.  It also allows tutors to collect (or download) batches of assignments that students have submitted.

When assignments are downloaded, tutors use a piece of software called the ETMA file handler.  This is a relatively simple piece of software that allows tutors to get an overview of which student has submitted which assignment.  It also allows tutors to open students’ work, allowing them to comment on (and mark) what has been submitted.

There are three things that a tutor usually has to do.  Firstly, they have to assign a mark for a student’s submission.  Secondly, they usually have to add some comments to a script that has been submitted (which is usually in the form of a Microsoft Word document).  Finally, they have to add some comments to help a student to move forward with their studies.  These comments are entered into a form that is colloquially known as a PT3.  Please don’t ask me why it’s called this; I have no idea – but it seems to be an abbreviation that is deeply embedded within the fabric of the university.  If you talk to a tutor about a PT3 form, they know what you’re talking about.

Under the hood

Given that tutor marked assignments constitute a pretty big part of the teaching and learning experience in the university, the ETMA file handler program is a pretty important piece of software.  One of my own views (when it comes to software) is that if you understand how something works, you’ll be able to figure out how to use it better.

The intention behind my professional development session was to share something about how the ETMA file handler works, allowing tutors to carry out essential tasks such as making backups and moving sets of marking from one computer to another.  Whilst the university does a pretty good job of offering comprehensive training about how to use the file handler to enable tutors to get along with their job of marking, it isn’t so good at letting tutors know about how to do some of the system administration stuff that we all need to do from time to time, such as taking backups and moving files to another computer (hence my motivation to run this session).

One of my confessions is that I’m a computer scientist.  This means that I (sometimes) find it fun figuring out how stuff works, which means that I sometimes mess around with a piece of software to see how to break it, and then try to get it working again.  (Sometimes I manage to do this, other times I don’t!)  During the session I focussed on a small number of things: how the file handler program knows about the assignments that have been downloaded (it uses directories), how directories are structured, what ‘special files’ these directories contain, and where (and how) additional information is held.

Here’s what I focussed on: the directories that files are downloaded to, the directories that marked files are returned from, and how the file handler reads the contents of those directories so it is able to offer choices to a tutor.  Towards the end of the presentation, I also presented a number of what I considered to be useful tips.  These were: the file handler software is very stupid, the file handler software needs to know where your marking is, form habits, be consistent, save files in the same place, use zip files to move files around, and be paranoid!
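
By way of illustration, here’s a minimal Python sketch of the ‘use zip files to move files around’ tip.  I should stress that the folder name below is purely hypothetical; where your marking actually lives will depend on how your own file handler has been set up, so do check before trying anything like this.

    # A minimal sketch: bundle a marking folder into a single dated zip
    # file that can be copied to another computer (or kept as a backup).
    # The path below is hypothetical, not the file handler's real one.
    import shutil
    from datetime import date

    MARKING_DIR = r"C:\ETMA\marking"  # hypothetical location of downloaded marking
    backup_name = f"etma-backup-{date.today().isoformat()}"

    # Creates etma-backup-YYYY-MM-DD.zip containing the whole directory tree.
    shutil.make_archive(backup_name, "zip", MARKING_DIR)

Unzipping the archive on the destination computer (into the place where its file handler expects to find marking) completes the move.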

Reflections

Whilst I was writing the session, I thought to myself, ‘is this going to be too simple?’ and ‘surely everyone will get terribly bored with all this detail and all the geeky stuff that I’m going to be talking about?’  Thankfully, these fears were unfounded.  The detail, it turned out, seemed to be quite interesting.  Even if I was sharing the obvious, sometimes a shared understanding can offer some reassurance.

There were parts that went right, and other parts that went wrong (or, not as well as I had expected); both represented opportunities for learning.  The part that I almost got right was the timing.  I had an hour and a half to fill, and although the session had to be wrapped up pretty quickly (so everyone could get their sandwiches), the timing seemed to be (roughly) about right.

The part that I got wrong wasn’t something that was catastrophically wrong, but instead could be understood as an opportunity to improve the presentation the next time round.  We all use our computers in slightly different ways, and I have to confess that I became particularly fixated on using my own computer in quite a needlessly complicated way (in terms of how to create and use backup files).  As a result, I now have slightly more to talk about, which I think is a good thing (but I might have to re-jig the timing).

There is one implicit side effect of sharing how something is designed, or how something works.  When we know how something works, we can sometimes find new ways of working, or new ways to use the tools that we have at our disposal.  Whilst probing a strange piece of software can be a little frightening, it’s sometimes possible to find unexpected rewards.  We may never know what these are unless we spend time doing this.

And finally…

If you’re an associate lecturer, do try to find the time to come to one of the AL development events; you’re always likely to pick something up from the day (and this applies as much to the facilitator as it does to the tutor!)  As well as being useful, they can be good fun too!

After the session had been completed, and the projectors and laptops were turned off, I started to ask myself a question.  This was: ‘what can I do for the next conference?’  Answering this question is now going to be one of my next tasks.


Associate Lecturer Professional Development Conference: Kent College, Tonbridge

Edited by Christopher Douce, Monday, 24 Mar 2014, 14:14

The Open University in the South East ran one of their associate lecturer professional development conferences on 1 March 2014.  This year, the conference was held at Kent College, Tonbridge.  I don’t know whether I have written about this before, but this was the same site where I attended my first ever OU tutorial (as a rookie tutor).  Today, the site is very different.  Then it was gloomy and dark.  Now, the buildings are bright and airy, and boast a spectacular view of the Kent countryside.

This post is a very brief summary of the event.  The summary draws directly from the notes that I made during the day (and these, by definition, will probably contain a couple of mistakes!)  It also contains a bunch of rough reflections.  I should add that this blog is primarily intended for other associate lecturer colleagues, but it might accidentally be of wider interest to others too.

During this conference, I signed up for two sessions.  The first was entitled, ‘supporting academic writing’.  The second session was all about, ‘aligning TMA feedback to students’ needs and expectations’. 

Supporting academic writing

This first session was facilitated by Anna Calvi, who projected a set of phrases about academic writing onto a digital whiteboard.  A couple of examples were, ‘what is a semi-colon?’ and ‘I think of ideas and information as I write’. ‘Do any of you recognise these?  Which are the most important for you?’ Anna asked, challenging us to respond.  She didn’t have to wait long for an answer.

A couple of responses that I noted down were: explaining why structure is important, the importance of paraphrasing and differences between written English and spoken English.  There’s also the necessity to help students to understand what is meant by ‘written academic English’.  Some suggestions were immediately forthcoming: the choice of vocabulary, style and appropriate referencing.

One of the participants asked a question that I have heard asked before.  This was, ‘can all faculties have a module that helps students to write descriptively?’  The truth of the matter is that different faculties do different things.  In the Mathematics, Computing and Technology faculty, writing skills are embedded (and emphasised) within the introductory level 1 modules.  Other faculties have dedicated modules.  Two key modules are LB160 Professional Communication Skills for Business Studies, and L185 English for Academic Purposes, which I understand can contribute credit to some degree programmes.

During this session, all the tutors were directed towards other useful resources.  These include a useful student booklet entitled Reading and Taking Notes (PDF), which is connected to an accompanying Skills for Study website (OU website).  Another booklet is entitled Thinking Critically (PDF).  This one is particularly useful, since terms such as ‘analyse critically’ and ‘critically evaluate’ can (confusingly) appear within module texts, assignments and exams.

One of the points shared during this first session was really important: what academic writing is should be emphasised right at the start of a programme of study.

What needs to be done?

So, how can tutors help?  Anna introduced us to a tool known as the MASUS framework.  MASUS is an abbreviation of Measuring the Academic Skills of University Students, and it originally comes from the University of Sydney.  We were directed to a video (OU website) which describes what the framework is and how it works.  A big part of the framework (from what I remember) is a checklist for academic writing (OU website).  In essence, this tool helps us (tutors) to understand (or think about) what kind of academic writing support students might need.  Key areas can include the use of source materials (choosing the right ones), organising a response in an appropriate way, using language that is appropriate to both the audience and the task, and so on.  In some respects, the checklist is an awareness raising tool.  The tutor’s challenge lies in how to talk to students about aspects of writing.

If you’re interested, a more comprehensive summary of the MASUS framework (PDF) is available directly from the University of Sydney.  Another useful resource is the OU’s own Developing academic English, which tutors can refer students to.  We were also directed to an interesting external resource, a Grammar tutorial, from the University of Bristol.

Offering feedback

After looking at the checklist and these resources, we moved onto a wider discussion about how tutors can best help students to develop their academic writing.  I’ve made a note of two broad approaches; one is reactive, the other is proactive.  A reactive strategy might include offering general backward looking feedback and perhaps running a one to one session with a student.  A proactive approach, on the other hand, could include discussions through a tutor group forum, activities within tutorials, sharing of handouts that contain exercises, and practical feed-forward advice within assignments that have been returned.

TMA feedback can, for instance, include examples (or samples) of what is considered to be effective writing.  An important point that emerged from the discussions was that it is very important to be selective, since commenting on everything can be overwhelming.  One approach is to offer a summary and provide useful links (and pointers) to helpful resources.

On-line tutorials

Anna moved onto the question of what tutors might (potentially) do within either face to face or on-line tutorials to help students with their academic writing; this was the part of the session where tutors had an opportunity to share practice with each other.  Anna also had a number of sample activities that we could use, modify, or draw teaching inspiration from.

The first example was an activity where students had to choose key paragraphs from a piece of writing.  Students could then complete a ‘diagram’ to identify (and categorise) different parts (or aspects) of an argument.  Another activity might be to ask students to identify question words, key concepts and the relationships between them.

Further ideas include an activity to spot (or identify) parts of an essay, such as introductory sentences, background information, central claims and perhaps a conclusion.  A follow on activity might be to ask questions about the purpose of each section, then connect the discussion to the tasks that are required for an assignment.

There was also a suggestion of using some cards.  Students could be asked to match important terms written on cards to paragraphs. Terms could include: appropriate tone, formality, alternative views, vocabulary, linking words, and so on.  There would also be an opportunity to give examples, to allow tutors to emphasise the importance of writing principles.

A further tip was to search the OpenLearn website for phrases such as ‘paraphrasing’ (or module codes, such as L185) for instance.  The OpenLearn site contains some very useful fragments of larger courses which might be useful to direct students to.

Aligning TMA feedback to students’ needs and expectations

This second session was facilitated by Concha Furnborough.  Her session had the subheading, ‘how well does our feedback work?’, which is a very important question to ask.  It soon struck me that this session was about the sharing of research findings with the intention of informing (and developing) tutor practice.

I’ve made a note of another question: how do we bridge the gap between actual and desired performance?  Connecting back to the previous session, a really important principle is to offer ‘feed-forward’ comments, which aim to guide and alter future behaviour.

An early discussion point that I noted was that some students don’t take the time to download their feedback (after they have discovered what their assignment marks were).  We were all reminded that we (as tutors) really need to take the time to make sure students download the feedback that they are entitled to receive.

This session described some of the outcomes from a project called eFeP, an abbreviation of e-Feedback evaluation project, funded by Jisc (which supports the use of digital technologies in education and research).  If you’re interested, more information about the project is available from the eFeP project website (Jisc).

The aim of the project was to understand the preferences and perceptions that students have about the auditory and written feedback offered by language tutors.  The project used a combination of different techniques.  Firstly, it used a survey.  The survey was followed by a set of interviews.  Finally, ten students were asked to make a screen-cast recording; students were asked to talk through their responses to the feedback and guidance offered by their tutors.

One of the most interesting parts of the presentation (for me) was a description of a tool known as ‘feedback scaffolding’.  The ‘scaffolding’ corresponds to the different levels or layers of feedback that are offered to students.  The first level relates to a problem or issue that exists in an assignment.  Level two relates to an identification of the type of error.  If we’re thinking in terms of language teaching, this might be the wrong word case (or gender) being applied.  The third level is where an error is corrected.  The fourth is where an explanation is given, and the fifth is clear advice on how performance might be potentially improved.
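
To make the levels a little more concrete, here’s a rough sketch of the scaffolding written out as a Python structure, of the kind a tutor might use to audit their own comments.  The wording of each level is my own paraphrase of my notes, not the project’s.

    # The five levels of 'feedback scaffolding', paraphrased from my notes.
    FEEDBACK_SCAFFOLD = {
        1: "indicate that a problem or issue exists in the assignment",
        2: "identify the type of error (e.g. wrong case or gender)",
        3: "correct the error",
        4: "explain why it is an error",
        5: "advise on how future performance might be improved",
    }

    # A tutor auditing their own marking might tally their comments by level:
    comment_levels = [1, 2, 2, 3, 4, 5]  # made-up classifications
    for level, description in FEEDBACK_SCAFFOLD.items():
        print(f"Level {level} ({description}): {comment_levels.count(level)}")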

Feeling slightly disruptive, I had to ask a couple of questions.  Firstly, I asked whether there was a category where tutors might work to contextualise a particular assignment or question, i.e. to explain how it relates to the subject as a whole, or to explain why a question is asked by a module team.  In some respects, this can fall under the final category, but perhaps not entirely.

My second question was about when in their learning cycle students were asked to comment on their feedback.  The answer was that they gave their feedback once they had taken the time to read through and assimilate the comments and guidance that the tutors had offered.   Another thought would be to capture how feedback is understood the instant that it is received by a learner.  (I understand that the researchers have plans to carry out further research).

If anyone is interested, there is a project blog (OU website), and it’s also possible to download a copy of a conference paper about the research from the OU’s research repository.

Reflections

Even though I attended only two sessions, there was a lot to take in.  One really interesting point was to hear different views about the challenges of academic writing from different people who work in different parts of the university.  I’ve heard it said that academic writing (of the type of writing needed to complete TMAs) is very tough if you’re doing it for the first time.  In terms of raising awareness of different resources that tutors could use to help students, the first session was especially useful.

These conferences are not often used to disseminate research findings, but the material that was covered in the second session was especially useful.  It exposed us to a feedback framework that I wasn’t previously aware of, and it directly encouraged us to consider how our feedback is perceived and used.

One of the biggest benefits of these conferences is that they represent an opportunity to share practices.  A phrase that I’ve often heard is, ‘you always pick up something new’.

Copies of the presentations used during the conference can be found by visiting the South East Region conference resources page (OU website, staff only).

Footnote

A week after drafting this summary, I heard that the university plans to close the South East regional centre in East Grinstead.  I started with the South East region back in 2006, and it was through this region that I began my career as an associate lecturer.

All associate lecturers are offered two days of professional development as part of their contract, and the events that the region has offered have helped to shape, inform and inspire my teaching practice.  Their professional development events have helped me to understand how to run engaging tutorials, my comfort zone has been thoroughly stretched through inspiring ‘role play’ exercises, and I’ve also been offered exceptional guidance about how to provide effective correspondence tuition.

Without a doubt, the region has had a fundamental and transformative effect on how I teach and has clearly influenced the positive way that I view my role as an associate lecturer.  The professional development has always been supportive, respectful and motivating.

The implications of the closure of the South East region on continuing professional development, for both new and existing tutors, are currently unclear.  My own view is probably an obvious one: if these rare opportunities for sharing and learning were to disappear, the support that the university offers its tutors would be impoverished.


e-Learning community event: mobile devices

Edited by Christopher Douce, Thursday, 20 Feb 2014, 12:01

Mobile devices are everywhere.  On a typical tube ride to the regional office in London, I see loads of different devices.  You can easily recognise the Amazon Kindle; you see the old type with buttons, and the more modern version with its touch screen.  Other passengers read electronic books with Android and Apple tablets.  Other commuters study their smart phones with intensity, and I’m fascinated with what is becoming possible with the bigger screen phones, such as the Samsung Note (or phablets, as I understand they’re called).  Technology is giving us both convenience and an opportunity to snatch moments of reading in the dead time of travel.

I have a connection with a module which is all about accessible online learning (H810 module description).  In the context of the module, accessibility is all about making materials, products and tools usable for people who have disabilities.  Accessibility can also be considered in a wider sense, in terms of making materials available to learners irrespective of their situation or environment.  In the most recent presentation of H810, the module team has made much of the learning material available in eBook or Kindle format.  The fact that materials can be made available in this format is potentially transformative and opens up opportunities to ‘snatch’ more moments of learning.

An event I attended on 11 February 2014, held in the university library, was all about sharing research and practice about the use of mobile devices.  I missed the first presentation, which was all about the use of OU Live (an on-line real time conferencing system) using tablet devices.  The other two presentations (which I’ve made notes about) explored two different perspectives: the perspective of the student, and the perspective of the associate lecturer (or tutor).

(It was also interesting to note that the event was packed to capacity; it was standing room only.  Mobile technology and its impact on learning seems to be a hot topic).

Do students study and learn differently using e-readers?

The first presentation I managed to pay attention to was by Anne Campbell, who had conducted a study about how students use e-readers.  Her research question (according to my notes) was whether users of these devices could perform deep reading (when you become absorbed and immersed in a text) and active learning, or whether learners get easily distracted by the technology.  Active learning can be thought of as carrying out activities such as highlighting, note taking and summarising – all the things that you used to be able to do with a paper based text book and materials.

Anne gave us a bit of context.  Apparently half of OU postgraduate students use a tablet or e-reader, and most use it for studying.  Also, half of UK households have some kind of e-reader.  Anne also told us that there was very little research on how students study and learn using e-readers, so she conducted a small research project to learn more about how students consume and work with electronic resources and readers.

The study comprised seventeen students.  Six students were from the social sciences and eleven were studying science, and they covered a broad range of ages.  The study was a longitudinal diary study; whenever students used their devices, they were required to make an entry.  This was complemented with a series of semi-structured interviews.  Subsequently, a huge amount of rich qualitative data was collected and then analysed using a technique known as grounded theory.  (The key themes and subjects that are contained within the data are gradually exposed by looking at the detail of what the participants have said and have written.)

One of the differences between using e-readers and traditional text books is the lack of spatial cues.  We’re used to the physical size of a book, so it’s possible to (roughly) know where certain chapters are once we’re familiar with its contents.  It’s also harder to skim read with e-readers, but on the other hand this may force readers to read in more depth.  One comment I’ve noted is, ‘I think with the Kindle… it is sinking in more’.  This, however, isn’t true for all students.

I’ve also noted that there are clear benefits in terms of size.  Some text books are clearly very heavy and bulky; you need a reasonably sized bag to move them around from place to place, but with an e-reader, you can (of course) transfer all the books that you need for a module to the device.  Other advantages are that you can search for key phrases using an e-reader.  I’ve learnt that some e-readers contain a built in dictionary (which means that readers can look up words without having to reach for a paper dictionary).  Other advantages include a ‘clickable index’ (which can help with navigation).  Other more implicit advantages include the ability to change the size of the text on the display, and the ability to use the ‘voice readout’ function of a mobile device (but I don’t think any participants used this feature).

I also noted that e-readers might not be as well suited for active learning for the reasons that I touched on above, but apparently it’s possible to perform highlights and to record notes within an ebook.

My final note of this session was, ‘new types of study advice needed?’  More on this thought later.

Perspectives from a remote and rural AL

Tamsin Smith, from the Faculty of Science, talked about how mobile technology helps her in her role as an associate lecturer.  I found the subject of this talk immediately interesting and was very keen to learn about Tamsin’s experiences.  One of the modules that Tamsin tutors on consists of seven health science books.  The size and convenience of e-readers can obviously benefit tutors as well as students.

On some modules, key documents such as assignment guides or tutor notes are available as PDFs.  If they’re not directly available, they can be converted into PDFs using freely available software tools.  When you have got the documents in this format, you can access them using your device of choice.  In Tamsin’s case, this was an iPad mini. 

On the subject of different devices, Tamsin also mentioned a new app called OU Anywhere, which is available for both iOS and Android devices.  After this talk, I gave OU Anywhere a try, downloading it to my smartphone.  I soon saw that I could access all the core blocks for the module that I tutor on, along with a whole bunch of other modules.  I could also access videos that were available through the DVD that was supplied with the module.  Clearly, this appeared (at first glance) to be pretty useful, and was something that I needed to spend a bit more time looking at.

Other than the clear advantages of size and mobility, Tamsin also said that there were other advantages.  These included an ability to highlight sections, to add notes, to save bookmarks and to perform searches.  Searching was highlighted as particularly valuable.  Tutors could, for example, perform searches for relevant module materials during the middle of tutorials. 

Through an internet connection, our devices can allow access to the OU library, on-line tutorials through OU Live (as covered during the first presentation that I missed), and tutor group discussion forums, allowing tutors to keep track of discussions and support students whilst they’re on the move.  This said, internet access is not available everywhere, so the facility to download and store resources is a valuable necessity.  This, it was said, was the biggest change to practice: the ability to carry all materials easily and access them quickly.

One point that I did learn from this presentation is that there is an ETMA file handler that is available for the iPad (but not one that is officially sanctioned or supported by the university).

Final thoughts

What I really liked about Anne’s study was its research approach.  I really liked the fact that it used something called a diary study (which is a technique that is touched on as a part of the M364 Interaction Design module).  This study aimed to learn how learning is done.  It struck me that some learners (including myself) might have to experiment with different combinations of study approaches and techniques to find out what works and what doesn’t.  Study technique (I thought) might be a judgement for the individual.

When I enrolled on my first postgraduate module with the Open University, I was sent a book entitled, The Good Study Guide by Andrew Northedge (companion website).  It was one of those books where I thought to myself, ‘how come it’s taken me such a long time to get around to reading this?’, and, ‘if only I had read this as an undergraduate, I might have perhaps managed to get a higher score in some of my exams’.  It was packed full of practical advice about topics such as time management, using a computer to study, reading, making notes, writing and preparing for exams.

It was interesting to hear from Anne’s presentation that studying using our new-fangled devices is that little bit different.  Whilst on one hand we lose some of our ability to put post-it notes between pages and see where our thumbs have been, we gain mobility, convenience and extra facilities such as searching.

It is very clear that more and more university materials can now be accessed using electronic readers.  Whilst this is likely to be a good thing (in terms of convenience), there are two main issues (that are connected to each other) that I think we need to bear in mind.

The first is a very practical issue: how do we get the materials onto our devices?  Two related questions are: how can we move our materials between different devices? and, how do we effectively manage the materials once we have saved them to our devices?  We might end up downloading a whole set of different files, ranging from different module blocks, assignments and other guidance documents.  It’s important to figure out a way to best manage these files: we need to be literate in how we use our devices.  (As an aside, these questions loosely connect with the nebulous concept of the Personal Learning Environment.)

The second issue relates to learning.  In the first presentation, Anne mentioned the term ‘active learning’.  The Good Study Guide contains a chapter about ‘making notes’.  Everyone is different, but I can’t help but think that there’s an opportunity for ‘practice sharing’.  What I mean is that there’s an opportunity to share stories of how learners can effectively make use of these mobile devices, perhaps in combination with more traditional approaches for study (such as note taking and paraphrasing).  Sharing tips and tricks about how mobile devices can fit into a personalised study plan has the potential to show how these new tools can be successfully applied.

A final thought relates to the broad subject of learning design.  Given that half of all households now have access to e-readers of one form or another (as stated in the first presentation I’ve covered), module teams need to be mindful of the opportunities and challenges that these devices can offer.  Although this is slightly away from my home discipline and core subject, I certainly feel that there is work to be done to further understand what these challenges and opportunities might be.  I’m sure that there has been a lot more work carried out than I am aware of.  If you know of any studies that are relevant, please feel free to comment below.

Video recordings of these presentations are available through the university Stadium website.


Gresham College: Designing IT to make healthcare safer


On 11 February, I was back at the Museum of London.  This time, I wasn’t there to see juggling mathematicians (Gresham College) talking about theoretical anti-balls.  Instead, I was there for a lecture about the usability and design of medical devices by Harold Thimbleby, who I understand is from Swansea University.

Before the lecture started, we were subjected to a looped video of a car crash test; a modern car from 2009 was crashed into a car built in the 1960s.  The result (and later point) was obvious: modern cars are safer than older cars.  Continual testing and development makes a difference.  We now have substantially safer cars.  Even though there have been substantial improvements, Harold made a really interesting point.  He said, ‘if bad design was a disease, it would be our 3rd biggest killer’.

Computers are everywhere in healthcare, so perhaps introducing computers (or mobile devices) might be able to help?  This might well be the case, but there is also the risk that hospital staff might end up spending more time trying to get technology to do the right things rather than dealing with more important patient issues.  There is an underlying question of whether a technology is appropriate or not.

This blog post has been pulled directly from the notes that I made during the lecture.  If you’re interested, I’ve provided a link to the transcript of the talk, which can be found at the end.

Infusion pumps

Harold showed us pictures of a series of infusion pumps.  I didn’t know what an infusion pump was.  Apparently it’s a device that is a bit like an intravenous drip, but you program it to dispense a fluid (or drug) into the blood stream at a certain rate.  I was very surprised by the pictures: the infusion pumps all looked very different from each other, and these differences were quite shocking.  They each had different screens and different displays.  They were different sizes and had different keypad layouts.  It was clear that there was little in the way of internal and external consistency.  Harold made an important point, that they were ‘not designed to be readable, they were designed to be cheap’ (please forgive my paraphrasing here).

We were regaled with further examples of interaction design terror.  A decimal point button was placed on an arrow key.  It was clear that there was no appropriate mapping between a button and its intended task.  Pushing a help button gave little in the way of help to the user.

We were told of a human factors analysis study where six nurses were required to use an infusion pump over a period of two hours (I think I’ve noted this down correctly).  The conclusion was that all of the nurses were confused.  Sixty percent of the nurses needed hints on how to use the device, and sixty percent were confused by how the decimal point worked (in this particular example).  Strikingly, sixty percent of those nurses entered the wrong settings.

We’re not talking about trivial mistakes here; we’re talking about mistakes where users may be fundamentally confused by the appearance and location of a decimal point.   Since we’re also talking about devices that dispense drugs, small errors can become life threateningly catastrophic.

Calculators

Another example of a device where errors can become significant is the common hand-held calculator.  Now, I was of the opinion that modern calculators were pretty idiot proof, but it seems that I might well be the idiot for assuming this.  Harold gave us an example where we had to calculate simple percentages of the world population.  Our hand-held calculator silently threw away zeros, without giving us any feedback.  If we’re not thinking, and since we implicitly trust that calculators carry out calculations correctly, we can easily assume that the answer is correct too.  The point is clear: ‘calculators should not be used in hospitals, they allow you to make mistakes, and they don’t care’.

Harold made another interesting point: when we use a calculator we often look at the keypad rather than the screen.  We might have a mental model of how a calculator works that is different to how it actually responds.   Calculators that have additional functions (such as a backspace, or delete last keypress buttons) might well break our understanding and expectations of how these devices operate.  Consistency is therefore very important (along with the visibility of results and feedback from errors).

There was an interesting link between this Gresham lecture and the lecture by Tony Mann (blog summary), which took place in January 2014.  Tony made the exact same point that Harold did.  When we make mistakes, we can very easily blame ourselves rather than the devices that we’re using.  Since we hold this bias, we’re also reluctant to raise concerns about the usability of the devices and equipment that we’re using.

Speeds of Thinking

Another interesting link was that Harold drew upon research by Daniel Kahneman (Wikipedia), explicitly connecting the subject of interface design with the subject of cognitive psychology.  Harold mentioned one of Kahneman’s recent books entitled: ‘Thinking Fast and Slow’, which posits that there are two cognitive systems in the brain: a perceptual system which makes quick decisions, and a slower system which makes more reasoned decisions (I’m relying on my notes again; I’ve got Daniel’s book on my bookshelves, amidst loads of others I have chalked down to read!)

Good design should take account of both the fast and the slow system.  One really nice example was the use of a cashpoint to withdraw money from your bank account.  Towards the end of the transaction, the cashpoint begins to beep continually (offering perceptual feedback).  The presence of the feedback causes the slower system to focus attention on the task that has to be completed (which is to collect the bank card).  Harold’s point is simple: ‘if you design technology properly we can make the world better’.

Visibility of information

How do you choose one device or product over another?  One approach is to make usually hidden information more visible to those who are tasked with making decisions.  A really good example of this is the energy efficiency ratings on household items, such as refrigerators and washing machines.  A similar rating scheme is available on car tyres too, exposing attributes such as noise, stopping distance and fuel consumption.  Harold’s point was: why not create a rating system for the usability of devices?

Summary

The Open University module M364 Fundamentals of Interaction Design highlights two benefits of good interaction design.  These are: an economic argument (that good usability can save time and money), and a safety argument.

This talk clearly emphasised the importance of the safety argument and emphasised good design principles (such as those created by Donald Norman), such as visibility of information, feedback of action, consistency between and within devices, and appropriate mapping (which means that buttons that are pressed should do the operation that they are expected to do).

Harold’s lecture concluded with a number of points that relate to the design of medical devices.  (Of which there were four, but I’ve only made a note of three!)  The first is that it’s important to rigorously assess technology, since this way we can ‘smoke out’ any design errors and problems (evaluation is incidentally a big part of M364).  The second is that it is important to automate resilience, or to offer clear feedback to the users.  The third is to make safety visible through clear labelling.

It was all pretty thought provoking stuff which was very clearly presented.  One thing that struck me (mostly after the talk) is that interactive devices don’t exist in isolation – they’re always used within an environment.  Understanding the environment, and the way in which communications occur between the different people who work within that environment, is also important (and there are different techniques that can be used to learn more about this).

Towards the end of the talk, I had a question in mind that someone else went on to ask.  It was, ‘is it possible to draw inspiration from the aviation industry and apply it to medicine?’  It was a very good question.  I’ve read (in another OU module) that an aircraft cockpit can be used as a way to communicate system state to both pilots.  Clearly, this is a subject of on-going research, and Harold directed us to a site called CHI Med (computer-human interaction).

Much food for thought!  I came away from the lecture feeling mildly terrified, but one consolation was that I had at least learnt what an infusion pump was.  As promised, here’s a link to the transcript of the talk, entitled Designing IT to make healthcare safer (Gresham College). 


Bletchley Park archive course


At the end of January, I took a day off my usual duties and went to an event called the ‘Bletchley Park archive course’.  I heard about the course through the Bletchley Park mailing list.  As soon as I received the message telling me about it I contacted the organisers, but unfortunately I was already too late: there were no longer any spaces on the first event.  Thanks to a kind hearted volunteer, I was told about the follow up event.

This blog post is likely to be the first of a number of blog posts about Bletchley Park, a place that is significant not only in terms of Second World War intelligence gathering and analysis, but also in the history of computing.  It’s a place I’ve been to a couple of times, but this visit had a definite purpose: to learn more about the archives and what they might be able to tell a very casual historian of technology, like myself.

I awoke at about half six in the morning, the usual time for when I have to travel to Milton Keynes, and found my way to my local train station.  The weather was shocking, as it was for the whole of January.  I was wearing sturdy boots and had donned a raincoat, as instructed by the course organisers.  Two trains later, I was at Euston Station, ready to take the relatively short journey north towards Milton Keynes, and then onto the small town of Bletchley, just one stop away.

Three quarters of an hour later, after walking through driving rain and passing what appeared to be a busy building site, I had found the room where the ‘adult education’ course was to take place.

Introduction and History

The day was hosted by Bletchley Park volunteer Susan Slater.  Susan began by talking about the history of the site that was ultimately to become a pivotal centre for wartime intelligence.  Originally belonging to a financier, the Bletchley Park manor house and adjoining lands were put up for auction in 1937.

Bletchley was a good location; it was pretty inconspicuous.  It was also served by two railway lines: one that went to London, and another that ran from east to west, connecting the universities of Oxford and Cambridge.  Not only was it well served in terms of transport, the railway also offered other kinds of links too – it was possible to connect to telecommunication links that I understand ran next to the track.  Importantly, it was situated outside of London (and away from the never ending trials of the Blitz).

Susan presented an old map and asked us what we thought it was.  It turned out to be a map of the telegraph system during the time of the British Empire; red wires criss-crossed the globe.  The telegraph system can be roughly considered to be a ‘store and forward’ system.  Since it was impossible (due to the distances involved) to send a message from England to, say, Australia directly, messages (sent in Morse code) were relayed via a number of intermediate stations (or hubs).

Susan made the point that whoever ran the telecommunication hubs was also able to read all the messages that were transferred through them.  If you want your communications to be kept secret, the thing to do is to encode them in some way.  Interestingly, Susan also referred to Edward II, where there was a decree in around 1324 (if I understand this correctly!) that stated ‘all letters coming from or going to parts overseas [could] be seized’.  Clearly, the contemporary debates about the interception of communications have very deep historical roots.

We were introduced to some key terms.  A code is a representation of letters and words by other letters and words.  A cypher is a scheme by which letters are replaced with other letters.  I’ve also noted that if something is formulaic (or predictable), then it can become breakable (which is why you want to hide artefacts of language - certain characters in a language are statistically more frequent than others, for example).  The most secure way to encode a message is to use what is known as a one-time pad (Wikipedia).  This is an encoding mechanism that is used only once and then thrown away.
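
Since the idea is so easy to state, here’s a minimal Python sketch of a one-time pad over the 26-letter alphabet.  Because every key letter is random, used once, and then thrown away, the ciphertext carries none of the statistical artefacts of language mentioned above.

    # A minimal one-time pad sketch: each random key letter shifts the
    # corresponding message letter; applying the same pad in reverse
    # recovers the message.  The pad must be as long as the message,
    # used once, and then destroyed.
    import secrets
    import string

    ALPHABET = string.ascii_uppercase

    def make_pad(length):
        return "".join(secrets.choice(ALPHABET) for _ in range(length))

    def apply_pad(text, pad, decrypt=False):
        sign = -1 if decrypt else 1
        return "".join(
            ALPHABET[(ALPHABET.index(c) + sign * ALPHABET.index(k)) % 26]
            for c, k in zip(text, pad)
        )

    message = "ATTACKATDAWN"
    pad = make_pad(len(message))
    cipher = apply_pad(message, pad)
    print(cipher, apply_pad(cipher, pad, decrypt=True))  # recovers the message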

An Enigma machine (Wikipedia), which sat at the front of the classroom, was an electro-mechanical implementation of an encoding mechanism.  Susan outlined its design to us: it had a keyboard like a typewriter, plug boards (to replace one letter with another), three or four rotors that had the same number of positions as there were characters (and which moved every time you pressed a key), and wiring within the rotors that changed the ‘letters’ even further.

Second session: how it all worked

After a swift break, we dived straight into the second session, where we were split into two teams.  One team had to encrypt a message (using the Enigma machine), and the second team had to use the same machine to decrypt the same message (things were made easier since the ‘decrypting side’ knew what all the machine settings were).   I think my contribution was to either press a letter ‘F’ or a letter ‘Q’ – I forget!  Rotors turned and lights lit up.  The seventy-something year old machine still did its stuff.
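
For fellow programmers, here’s a toy Python sketch of the rotor idea, using the published wiring of one of the historical rotors.  It deliberately leaves out the plug board and the reflector, so (unlike a real Enigma) encoding and decoding are separate functions here; the point is simply to show why a stepping rotor makes the same letter encrypt differently each time.

    # A toy, single-rotor sketch of the Enigma idea.  The rotor steps on
    # every key press, so repeated letters produce different outputs.
    import string

    ALPHABET = string.ascii_uppercase
    WIRING = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"  # wiring of the historical rotor I

    def encode(text, start):
        out, pos = [], start
        for c in text:
            pos = (pos + 1) % 26  # the rotor steps before each letter
            out.append(WIRING[(ALPHABET.index(c) + pos) % 26])
        return "".join(out)

    def decode(text, start):
        out, pos = [], start
        for c in text:
            pos = (pos + 1) % 26
            out.append(ALPHABET[(WIRING.index(c) - pos) % 26])
        return "".join(out)

    print(encode("AAAA", 0))              # the same letter encodes differently each time
    print(decode(encode("HELLO", 7), 7))  # matching settings recover HELLO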

What follows are some rough notes from my notebook (made quickly during the class).  We were told that different parts of the German military used different code books (and also that the Naval Enigma machine was different to other Enigma machines).  Each code book lasted for around six weeks.  The code book contained information such as the day, rotor position, starting point of the rotor and plug board settings; everything you needed to make understandable messages totally incomprehensible.

The challenge was, of course, to uncover what the settings of an Enigma machine were (so messages could be decrypted).  A machine called the Bombe (Wikipedia) was invented to help with the process of figuring out what the settings might be.  When the settings were (potentially) uncovered, they were tested by entering them into a machine called the Typex (which was, in essence, a version of an Enigma machine) along with the original message, to see if plain text (an unencrypted message) appeared.

The Enigma wasn’t the only machine that was used to encrypt (and decrypt) messages.  Enigma (as far as I understand) was used for tactical communications.  Higher level strategic communications used in the German high command were transmitted using the Lorenz cypher.  This more complicated machine contained a paper tape reader which allowed the automatic transmission of messages, dispensing with the need for a Morse code operator.

In terms of the scale of the operation at Bletchley Park, we were told that three thousand Enigma messages were being decoded every day, along with forty Lorenz messages.  To help with this, there were 210 Bombe machines to help with the Enigma codes, and a machine that is sometimes described as ‘the world’s first electronic computer’, the Colossus machine.  At its peak, there were apparently ten thousand workers (three quarters of whom were women), running three shifts.

Bombe Demo

After a short break, we were gently ushered downstairs to one of the museum exhibits; a reconstruction of a Bombe machine.  This was an electro-mechanical device that ‘sped up’ the process of discovering Enigma machine settings.  Two operators described how it worked and then turned it on.  It emitted a low whirring and clicking noise as it mechanically went through hundreds of combinations.

As the Bombe was running, I had a thought: I wondered how you might go about writing a computer program, or a simulation, to do pretty much the same thing.  The machine operators talked about the use of something called a ‘code map’, which helped them to find the route towards the settings.  I imagined an interactive smartphone or tablet app that allowed you to play with your own version of a Bombe, to get a feel for how it would work...  There could even be a virtual Enigma machine that you could play with; you could create a digital playground for budding cryptographers.

Of course, there’s no such thing as an original thought: a Bombe simulator has already been written by the late Tony Sale (who reconstructed the Colossus machine), and a quick internet search revealed a bunch of Enigma machine simulators.  One burning question is how we might make the best use of these tools and resources.
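
For what it’s worth, the core of such a simulation is tiny.  Using the toy rotor machine sketched above, a Bombe-style search is just a matter of trying every starting position and keeping those that produce a ‘crib’ - a word you suspect appears in the plaintext.

    # A Bombe-style known-plaintext search over the toy rotor machine
    # above: try every rotor starting position and keep those under
    # which the ciphertext decodes to something containing the crib.
    def find_settings(ciphertext, crib):
        return [start for start in range(26)
                if crib in decode(ciphertext, start)]

    secret = encode("WEATHERREPORTFORTODAY", 7)
    print(find_settings(secret, "WEATHER"))  # the true setting, 7, is found

The real Bombe, of course, faced an astronomically larger space of rotor orders, positions and plug board settings, which is why the clever ‘code map’ reasoning was needed to prune it.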

Archive Talk

The next part of the day was all about the archive; the real reason I signed up for this event.  I have to confess that I didn’t really know what to expect and this sense of uncertainty was compounded by having a general interest rather than having a very specific research question in mind.

The archive is run by the Bletchley Park Trust.  GCHQ, the Government Communication Headquarters, is the custodian of the records that have come from Bletchley Park.  I understand that GCHQ is going to use Bletchley Park as its ‘reading room’, having lent around one hundred and twenty thousand documents for a period of fifty years.

By way of a very general introduction, a number of samples from the archive were dotted around our training room.  These ranged from Japanese language training aids (and a hand-written Japanese-English dictionary), forms used to help with the decryption of transmissions, through to samples of transmissions that were captured during the D-Day landings.

Apparently, there’s a big project to digitise the archive; a multi-stage process is under way.  The first stage is to have the artefacts professionally photographed.  This is followed by (I believe) storing the documents in some kind of on-line repository.  Volunteers may then be actively needed to help create metadata (or descriptions) for each repository item, to enable them to be found by researchers.

Tour

The final part of the day was a tour.  As I mentioned earlier, I’ve been on a couple of Bletchley Park tours, but this was unlike any of the earlier tours I had been on before.  We were all given hard hats and told to don high visibility jackets.  We were then ushered into the driving rain.

After a couple of minutes of trudging, we arrived at a building that I had first seen when I entered the site.  The building (which I understand was known as ‘hut 3’) was to become a new visitors’ centre.  From what I remember, the building used to be one of the largest punched card archives in Europe, known as Deb’s delight (for a reason that completely escapes me).  It was apparently used to cross-reference stuff (and I’m writing in terrible generalisations here, since I really don’t know very much!)

Inside, there was no real lighting, and dust from work on the floors hung in the air.  There was a strong odour of glue or paint.  Stuff was clearly happening.  Internal walls had been stripped away to reveal a large open plan area which would become an ideal exhibition space.  Rather than being a wooden prefabricated ‘hut’, we were walking through a substantial brick building.

Minutes later, we were directed towards two other huts that were undergoing restoration.  These were the wooden ones.  It was obvious that these buildings had lacked any kind of care and attention for many years, and workmen were busy securing the internal structure.  Avoiding lights and squeezing past tools, we snaked through a series of claustrophobic corridors, passing through what used to be the Army Intelligence block and then onto the Navy Intelligence block.  These were the rooms in which real secrets became clear.   Damp hung in the air, and mould could be seen creeping up some of the old walls.  There was clearly a lot of work that needed to be done.

Final thoughts

Every time I visit Bletchley Park, I learn something new.  This time, I became more aware of what happened in the different buildings, and I certainly learnt more about the future plans for the archive.  Through the talks that took place at the start of the day, I also learnt of a place called the Telegraph Museum (museum website), which can be found at Porthcurno, Cornwall.  When walking through the various corridors to the education room, I remember seeing a large poster that suggested that all communication links come to Bletchley Park, and that Bletchley is the centre of everything.

When it comes to a history of computing, it’s impossible to separate the history of the computer from the history of telecommunications.  In Bletchley Park, communications and computing are fundamentally intertwined.  There’s another aspect, which is that computing (and computing power) has led to the development of new forms of communication.  Before I go any further forward in time (from, say, 1940 onwards), there’s a journey that I have to make back in time: a diversion to discover more about telecommunications, and a good place to start is by learning more about the history of the telegraph system.

I’ll be back another day (ideally when it’s not raining), to pay another call to Bletchley Park, and will also drop in to The National Museum of Computing, which occupies the same site.


Gresham College Lecture: Notations, Patterns and New Discoveries (Juggling!)


On a dark winter’s evening on 23 January 2014, I discovered a new part of London I had never been to before.  Dr Colin Wright gave a talk entitled ‘notations, patterns and new discoveries’ at the Museum of London.   The subject was intriguing in a number of different ways.  Firstly, it was all about the mathematics of juggling (which represented a combination of ideas that I had never come across before).  Secondly, it was about notations.

The reason why I was ‘hooked’ by the notation part of the title is because my home discipline is computer science.  Computers are programmed using notation systems (programming languages), and when I was doing some research into software maintenance and object-oriented programming I discovered a series of fascinating papers about something called the ‘cognitive dimensions of notations’.  Roughly put, these were all about how we can efficiently work with (and think about) different types of notation system.

In its broadest sense, a notation is an abstraction or a representation.  It allows us to write stuff down.  Juggling (like dance) is an activity that is dynamic, almost ethereal; it exists in time and space, and then it can disappear or stop in an instant.  Notation allows us to write down or describe the transitory.  Computer programming languages allow us to describe sets of invisible instructions and sequences of calculations that exist nowhere except within digital circuits.  When we’re able to write things down, it turns out that we can more easily reason about what we’ve described, and make new discoveries too.

It took between eight and ten minutes to figure out how to get into the Museum of London.  It sits in the middle of a roundabout that I’ve passed a number of times before.  Eventually, I was ushered into a huge cavernous lecture theatre, which clearly suggested that this was going to be quite ‘an event’.  I was not to be disappointed.

Within minutes of the start of the lecture, we heard the names of famous mathematicians: Gauss and Leibniz.  One view was that ‘truths (or proofs) should come from notions rather than notations’.  Colin, however, had a different view: that there is interplay between notions (or ideas) and notations.

During the lecture, I made a note of the following: a notation is a ‘specialist terminology [that] allows rapid and accurate communication’.  Colin then moved on to ask the question, ‘how can we describe a juggling pattern?’  This led to the creation of an abstraction that could describe the movement of juggling balls.

Whilst I was listening, I thought, ‘this is exactly what computer programmers do; we create one form of notation (a computer program), using another form of notation (a computer language) – the computer program is our abstraction of a problem that we’re trying to solve’.  Colin introduced us to juggling terms (or high level abstractions), such as the ‘shower’, ‘cascade’ and ‘Mills’ mess’.  This led towards the more intellectually demanding domain of ‘theoretical juggling’ (with an impossible number of balls).
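
The notation Colin described is, I believe, the one widely known as ‘siteswap’ (a name I’m adding here; I don’t recall whether it was used on the night).  A pattern is written as a repeating sequence of throw heights, and a quick calculation tells you whether a sequence is jugglable at all.  Here is a minimal sketch, in Python, of the standard validity check:

    def is_valid_siteswap(pattern):
        # A throw of height h made on beat i lands on beat (i + h).
        # Working modulo the pattern length, a pattern is jugglable
        # only if no two throws land on the same beat.
        n = len(pattern)
        landings = {(i + h) % n for i, h in enumerate(pattern)}
        return len(landings) == n

    def ball_count(pattern):
        # For a valid pattern, the number of balls is the mean throw height.
        return sum(pattern) / len(pattern)

    print(is_valid_siteswap([3]))        # True: the three-ball cascade
    print(is_valid_siteswap([5, 1]))     # True: a three-ball shower
    print(is_valid_siteswap([4, 3, 2]))  # False: all three throws collide
    print(ball_count([5, 1]))            # 3.0

The pleasing thing is that the number of balls falls out of the notation for free: it is simply the mean of the throw heights.  This is exactly the kind of reasoning that becomes possible once the transitory has been written down.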

My words can’t really do the lecture justice.  I should add that it is one of those lectures from which you would learn more by listening to it more than once.  Thankfully, for those who are interested, it was recorded, and it is available on-line (Gresham College).

Whilst I was witnessing all these great tricks, one thought crossed my mind, which was, ‘how much time did you have to spend to figure out all this stuff and to learn all these juggling tricks?!  Surely there was something better you could have done with your time!’ (Admittedly, I write this partially in jest and with jealousy, since I can’t catch and I fear that doing ‘a cascade’ with three balls is, for me, a theoretical impossibility).

It was a question that was implicitly answered by considering the importance of pure mathematics.  Doing and exploring stuff only because it is intellectually interesting may potentially lead to a real world practical use – the thing is that you don’t know what it might be and what new discoveries might emerge.  (A good example of this is number theory leading to the practical application of cryptography, which is used whenever we buy stuff over the internet). 

All in all, great fun.  Recommended.

Christopher Douce

Gresham College Lecture: User error – why it’s not your fault

Visible to anyone in the world

On 20 January 2014 I found the time to attend a public lecture in London that was all about usability and user error.  The lecture was presented by Tony Mann, from the University of Greenwich.  The event was in a group of buildings just down the street from Chancery Lane underground station.  Since I was keen on this topic, I arrived twenty minutes early, only to find that the Gresham College lecture theatre was already full to capacity.  User error (and interaction design), it seems, is a very popular subject!

One phrase that I’ve made a note of is ‘we blame ourselves if we cannot work something’; we can quickly acquire feelings of embarrassment and incompetence if we do things wrong or make mistakes.  Tony gave us the example that we can become very confused by the simplest of devices, such as doors.

Doors that are well designed should tell us how they should be used: we rely on visual cues to tell us whether they should be pushed or pulled (which is called affordance), and if we see a handle, then we regularly assume that the door should be pulled (which is our application of the design rule of ‘consistency’).  During this part of Tony’s talk, I could see him drawing heavily on Donald Norman’s book ‘The psychology of everyday things’ (Norman’s work is also featured within the Open University module, M364 Fundamentals of Interaction design).

I’ve made a note of Tony saying that when we interact with systems we take information from many different sources, not just the most obvious.  An interesting example was the Kegworth air disaster (Wikipedia), which occurred because the pilots shut down the wrong engine, drawing on experience gained from a different but similar aircraft.

Another really interesting example was the case where a pharmacy system was designed in such a way that drug names could be no more than 24 characters in length.  This created a situation where different drugs (which had very similar names, but different effects) could be prescribed by a doctor in combinations which could potentially cause fatal harm to patients.  Both of these examples connect perfectly to the safety argument for good interaction design.  Another argument (that is used in M364) is an economic one, i.e. poor interaction design costs users and businesses both time and money.
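
To make the pharmacy example concrete, here is a minimal sketch in Python.  The product names are invented purely for illustration (I don’t know the actual drugs involved):

    FIELD_WIDTH = 24  # the limit mentioned in the talk

    def stored_name(name):
        # Simulate a legacy field that silently truncates long names.
        return name[:FIELD_WIDTH]

    # Hypothetical product names, invented for this example:
    injection = 'examplumab 100 mg solution for injection'
    infusion = 'examplumab 100 mg solution for infusion'

    print(stored_name(injection))  # examplumab 100 mg soluti
    print(stored_name(infusion))   # examplumab 100 mg soluti
    # Two different products now have identical records.

Once the names have been truncated, nothing downstream of the database can tell the two products apart – the error has been designed in.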

Tony touched upon further issues that are also covered in M364.  He said, ‘we interact best [with a system] when we have a helpful mental model of a system’.  Our mental models determine our behaviour; humans (generally) have good intuition when interacting with physical objects, and it is hard to discard the mental models that we form.

Tony argued that it is the job of an interaction designer to help us to create a useful mental model of how a system works, and if there’s a conflict (between what a design tells us and how we think something may work), we can very easily get into trouble very quickly.  One way to help with this is to make use of metaphor.  Tony Mann: ‘a strategy is to show something that we understand’, such as a desktop metaphor or a file metaphor on a computer.  I’ve also paraphrased the following interesting idea, that a ‘designer needs to both think like a computer and think like a user’.

One point was clearly emphasised: we can easily choose not to report mistakes.  This means that designers might not always receive important feedback from their users.  Users may too easily think, ‘that’s just a stupid error that I’ve made…’  Good designs, it was argued, prevent errors (which is another important point that is addressed in M364).  Tony also introduced the notion of resilience strategies: things that we do to help us avoid making mistakes, such as hanging our scarf in a visible place so we remember to take it home after we’ve been somewhere.

The three concluding points were: we’re always too ready to blame ourselves when we make a blunder, that we don’t help designers as often as we ought to, and that good interaction design is difficult (because we need to consider different perspectives).

Tony’s talk touched upon wider (and related) subjects, such as the characteristics of human error and the ways that systems could be designed to minimise the risk of mistakes arising.  If I were to be very mean and offer a criticism, it would be that there was perhaps more of an opportunity to talk about the ‘human’ side of error – but here we begin to step into the domain of cognitive psychology (as well as engineering and mathematics).  This said, his talk was a useful and concise introduction to the importance of good interaction design.

Christopher Douce

Interaction design and user experience for motorcyclists

Visible to anyone in the world
Edited by Christopher Douce, Sunday, 9 Feb 2014, 16:04

Has anyone ever uttered the following phrases: ‘it must be me!’ or ‘I must be stupid, I can’t work this system!’?  When you say those words, the odds are that the problem has little to do with you and everything to do with the system that you’re trying to use.

Making usable systems and devices is all about understanding different perspectives and thinking about compromises.  Firstly, there’s the user (and understanding what he or she wants to do using a system).  Secondly, there’s the task that has to be completed (and how a task might be connected to other tasks and systems).  Finally, there’s the question of the environment, i.e. the situations in which a product is going to be used.  If you fully understand all these aspects in a lot of depth and balance one aspect against another, then you’ll be able to design a system that is usable (of course, this is a huge simplification of the process of interaction design, but I’m sure that you get my point).

Parking a motorbike

A couple of months ago I took a course at my second favourite academic institution, CityLit.  Since the weather was pretty good (despite it being January), I decided to ride my scooter into the middle of London and park in one of the parking bays that were not too far from the college.  The only problem was that the City of Westminster had introduced a charging scheme, and this was a system that I hadn’t used before.

This blog post is a polite rant about (and reflection on) the banal challenge of trying to pay Westminster council a grand total of one pound and twenty pence.  It turns out that the whole exercise is an interesting example of interaction design, since it helps us to think about issues surrounding who the user is, the environment in which a system is used and the task that has to be completed.  Paying for parking sounds like a pretty simple task, doesn’t it?  Well, let me explain…

Expecting trouble

Having heard about the motorcycle parking rules in Westminster, I decided to do some research.  I was expecting a simple system where you texted your bike registration number and a location code to a designated ‘parking’ telephone number, and through the magic of mobile telephony, one English pound was added to your monthly mobile phone bill and the same English pound was appropriated to Westminster Council.  Well, it turned out to be a bit more complicated than that.  Payments don’t come from your phone account but instead come from your credit card.  This means that you need to connect your phone number to your credit card number.

When you’ve found the motorbike registration site (which doesn’t have a recognisable ‘Westminster Council’ URL), you get to create something called a ‘parking account’.  When logged in, you’re asked to enter the registration number of your vehicle.  In my case, since I’m pretty weird, I have two motorbikes: one that makes the inside of the garage look pretty, and another one (a scooter) that I sometimes use to zip around town on.  There are enough spaces to enter the registration codes for four different bikes.

The thing is, I can’t remember the registration numbers for any of my bikes!  It turns out that I can hardly remember anything!  I can’t remember my phone number, I can’t remember my credit card number and I can’t remember two registration numbers.  I must be an idiot!  (Thankfully, I remembered my email address, which is something else you need – just make sure you know the password to access your account).

There was another oddity of the whole system.  After you’ve got an account, you log in using a PIN code, which is the last four numbers of your credit card.  I never use these four numbers!  Again, I don’t know what they are (unless I go and look)!  I was starting to get a bit impatient.

Arriving at the parking bay

The ride to the middle of town was great.  It was too early in the day for most people, which meant that the streets were quiet.  After parking my bike, I started to figure out how to pay.  I looked at an information sign, which was covered in city grime, and immediately saw that it didn’t have all the information I needed.

I visited the parking website and discovered that you needed FOUR different numbers!  You needed a phone number, a location number (where your bike is parked), a day code (to indicate how long you’re parking your bike for), and the final four numbers of your registered credit card.  Thankfully, I had the foresight to save the parking telephone number in my phone, so I only had to send three numbers (but I would rather have avoided messing around with my wallet to fish out my credit card; it meant unzipping and then zipping up layers of protective clothing).

Coffee break

At last, I had done it.  I had sent a payment text.  To celebrate my success, I visited a nearby café for a coffee and a sit down.  About ten minutes later, I received a text message that confirmed that I had paid for parking ‘FOR THE WRONG BIKE!’ 

The text message confirmed that I had just paid for parking for my ridiculous bike rather than the sensible city scooter that I had just used.  Also, when I registered both bikes on the system, I entered the scooter registration first, since it would be the bike that I would be using most.  At this point, I had no idea whether the system was clever enough to stupidly assume that I had ridden either (or both) of my bikes to Westminster at the same time.  There was no clear way to choose one bike as opposed to the other.  Again, I felt like an idiot.

Then, I had a crazy thought – perhaps I ought to try to look at my ‘parking record’, since there might be a way to change the vehicle I was using.  I logged in to the magic system (through my smartphone), entering the last four digits of my credit card, again, and found a screen that seemed to do what I wanted.  It encouraged me to enter start and end dates (what?), and then had a button entitled, ‘generate report’.  A report on what?  The number of toys found in Kinder Eggs that are considered to be dangerous?  I pushed the button.  Nothing happened.  I had no parking history despite having just sent a parking text.  Effective feedback is one of the most obvious and fundamental principles of good usability.

Chat

It took me around five minutes to walk to the college.  When I got there I discovered two other motorcycle parking bays that were just around the corner.  I then made a discovery: different bays seemed to have the same location ID.  It then struck me: perhaps the second number I had been entering into the phone was totally redundant!  Perhaps it’s the same code that is used all over London!

During my class I got chatting to a fellow biker.  After I had emoted about the minor trauma of trying to pay for the parking, my new biker friend said, ‘there’s an app for this…’  Again, I thought ‘why didn’t anyone tell me!’  So, during a break I found the right app and started a download.  After a couple of minutes of nothing happening, I was presented with the delightful message:  ‘Error downloading: 504’.

Final thoughts

A really good interaction design principle is that you should always try to design systems which minimise what users need to remember (Nielsen’s heuristics call this ‘recognition rather than recall’).  On this system, you needed to remember loads of different numbers and codes.  The task is pretty simple.  There is a fixed fee.  The only variables that you might want to enter are the length of the stay (in days) and the choice of vehicle.  But what happens if your phone runs out of charge and you want to use a friend’s phone to pay?  You’ll then have to make a telephone call to an operator, all for the sake of one pound twenty.
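
As an illustration (and this is purely a hypothetical sketch of my own devising, not a description of Westminster’s actual system), here’s what the task could look like if the account remembered everything it already knows.  Every name below is invented:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ParkingAccount:
        phone: str            # identifies the account; arrives with the text
        default_vehicle: str  # the registration the user nominated as usual
        card_token: str       # stored securely server-side, never re-typed

    def pay_for_parking(account: ParkingAccount, days: int = 1,
                        vehicle: Optional[str] = None) -> str:
        # Only the two genuine variables are ever supplied by the user.
        reg = vehicle or account.default_vehicle
        # ... charge the stored card via account.card_token here ...
        return f'Paid {days} day(s) of parking for {reg}'

    me = ParkingAccount('07700900123', 'AB12CDE', 'card-token-123')
    print(pay_for_parking(me))  # the common case needs no numbers at all

The point isn’t the code, of course; it’s that the burden of remembering four separate numbers can be shifted from the user to the system.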

There’s also the environment to contend with.  I had to take gloves off, fumble around in my pockets for my mobile phone and then enter numbers.  The information sign was pretty small (and I can’t remember it mentioning anything about using an app).  I dread to think how difficult the process is if English isn’t your first language, and you don’t know that Westminster has bike parking fees.

One final thought is that one approach to learning more about the user experience is to observe users in the things that they do.  This is an approach that has drawn heavily from the social sciences, and on Open University modules such as M364 Interaction Design, techniques such as ethnography are introduced.  Another approach to learning about user successes and failures is to search on-line, to learn about the problems other people have experienced.  Although this isn’t explicitly covered in M364, it is an interesting technique.

All this said, the second time that I needed to pay, I used the ‘pay by phone’ parking app.  The ‘504’ error message that I wrote about earlier had miraculously disappeared (why not a message that says, ‘please try again later’?) and I was able to download the app, press a couple of on-screen (virtual) buttons and enter the last four numbers of my credit card (again, a number that I still haven’t memorised, since no other system asks me for it…).  I even managed to pay for the right bike, this time!

Christopher Douce

MOOCs - What the research says

Visible to anyone in the world
Edited by Christopher Douce, Friday, 3 Jan 2014, 10:23

On 29 November 2013 I bailed out of the office and went to a place called the London Knowledge Lab to attend a dissemination event about MOOCs.  Just in case you’re not familiar with the term, a MOOC (Wikipedia) is an abbreviation for Massive Open On-line Course.  The London Knowledge Lab was a place that I had visited a few years ago to attend an event about e-assessment (blog post).

This post is a quick, overdue, summary of the event.  For those who are interested, the London Knowledge Lab has provided a link to a number of pages (Institute of Education) that summarise some of the presentations in a lot more detail.

Introductions

The event started with a brief presentation by Diana Laurillard, entitled The future potential of the MOOC.  During Diana’s presentation, I noted down a number of points that jumped out at me.

An important question to ask is: what problems is a MOOC going to solve?  Diana mentioned a UNESCO goal (UNESCO website) that states that every child should have access to compulsory education by 2015.  It’s also important to note that there is an increasing demand for higher education, but the current model in the sector is one member of staff for every 25 students.  If the objective is to reach as many people as possible, we’re immediately faced with some fundamental challenges.  One thought is that perhaps MOOCs might be able to help with the demand for education.

But why should an institution create a MOOC in the first place?  There are a number of reasons.  Firstly, a MOOC offers a taster of what you might expect from a particular course of study; secondly, it has the potential to enhance or sustain the reputation of the institution that provides (or supports) it; and thirdly, it offers an opportunity to carry out research and development at the intersection between information technology and education.  One of the fundamental challenges is how best to create a sustainable business model.

A point to bear in mind is that there hasn’t (yet) been a lot of research about MOOCs.   Some MOOCs clearly attract a certain demographic, i.e. professionals who already have degrees; this was a point that was echoed a number of times throughout the day.

Presentations

The first presentation of the day was by Martin Hawksey who talked about a MOOC run by the Association for Learning Technology (ALT website).  I made a note that it adopted a ‘connectivist’ model (but I’m not quite sure I know what this means), but it was clear that different types of technology were used within this MOOC, such as something called FeedWordPress (which appears to be a content aggregator).

Yishay Mor, from the Open University Institute of Educational Technology, spoke about a MOOC that was all about learning design.  I’ve made a note that his MOOC adopted a constructionist (Wikipedia) approach.  This MOOC used a Google site as a spine for the course, and also used an OU-developed tool called CloudWorks (OU website) to facilitate discussions.

Yishay’s tips about what not to do included: don’t use homebrew technology (since scaling is important), and don’t assume that classroom experiences work on a MOOC; from the facilitator’s perspective, the amount of interaction can be overwhelming.  An important note is that scaling might mean (in some instances) moving from a mechanical system to a dynamic system.

The third presentation of the day was by Mike Sharples, who was also from the Open University.  Mike also works as an academic lead for FutureLearn, a UK-based MOOC platform that was set up as a partnership between the Open University and other institutions.  At the time of his presentation, FutureLearn had approximately 50 courses (or MOOCs?) running.

I’ve noted that the pedagogy is described as ‘a social approach to online learning’ and Mike mentioned the term social constructivism.  I’ve also made a note that Laurillard’s conversational framework was mentioned, and ‘tight cycles’ of feedback are offered.  Other phrases used to describe the FutureLearn approach include vicarious learning, conversational learning and orchestrated collaboration. 

In terms of technology, Moodle was not used due to the sheer number of potential users.  The architecture of Moodle, it was argued, just wouldn’t be able to cope or scale.  Another interesting point was that the software platform was developed using an agile process and has been designed for desktop computers, tablets and smartphones. 

Barney Graner, from the University of London, described a MOOC that was delivered within Coursera (Coursera website).  I have to confess to taking two different Coursera courses, so this presentation was of immediate interest (although the content was very good, I didn’t manage to complete either of them due to time pressures).  The course that Barney spoke of was 6 weeks long, and required between 5 and 10 hours of study per week.  All in all, there were 212 thousand students registered, and 9% of those (roughly 19,000 people) completed.  Interestingly, 70% were said to hold a higher degree and the majority were employed.  Another interesting point was that when students paid a small fee to take something called a ‘signature track’, this apparently had a significant impact on retention statistics.

Matthew Yee-King from Goldsmiths gave a presentation entitled ‘metrics and systems for peer learning’.  In essence, Matthew spoke about how metrics can be used on different systems.  An important question that I’ve noted is, ‘how do we measure difference between systems?’ and ‘how do we measure if peer learning is working?’

The final presentation of the day, entitled ‘exploring and interacting with history on-line’ was by Andrew Payne, who was from the National Archive (National Archive education).  Andrew described a MOOC that focused on the use of archive materials in the classroom.  A tool called Blackboard Collaborate (Blackboard website) was used for on-line voice sessions, the same tool used by the Open University for many of their modules.

Towards the end of the day, during the start of a discussion period, I noted a number of key issues for further investigation.  These included: pedagogy, strategy and technology.

Reflections

In some respects, this day was less about sharing hard research findings (since the MOOC is such a new phenomenon) but more about the sharing of practice and ‘war stories’.

Some messages were simple, such as, ‘it’s important to engineer for scale’.  Other points certainly require further investigation, such as how MOOCs might best help to reach those groups of people who could potentially benefit most from participating in study.  It’s interesting that such a large number of participants already have degree level qualifications.  You might argue that these participants are already experienced learners.

It was really interesting to hear that different MOOCs made use of different tools.  Although I’m more of an expert in technology than pedagogy, I feel that there is a continuum between MOOCs (or on-line courses, in general) that offer an instructivist (or didactic) approach on one hand, and those that offer a constructivist approach on the other.  Different software tools, of course, permit different pedagogies.

Another (related) thought is that learners not only have to learn the subject that is the focus of a MOOC, but also the tool (or tools) through which the learning can be acquired.  When it comes to software (and those MOOCs that offer learners a range of different tools) my own view is that people use tools if they are sure that there is something in it for them, or the benefit of use outweighs the amount of investment that is expended in learning something.

In some respects, the evolution of a MOOC is an exercise in engineering as much as it is an exercise in mass education.  What I mean is that we’re creating tools that tell us about what is possible in terms of large scale on-line education.  Some tools and approaches will work, whereas other tools and approaches will not.  By collecting war stories and case studies (and speaking with the learners) we can begin to understand how to best create systems that work for the widest number of people, and how MOOCs can be used to augment and add to more ‘traditional’ forms of education.

One aspect that developers and designers of MOOCs need to be mindful of is the need for accessibility.  Designers of MOOCs need to consider this issue from the outset.  It’s important to provide media in different formats and create simple interfaces that enable all users to participate in on-line courses.  None of the presenters, as far as I recall, spoke about the importance of accessibility.  A high level of accessibility is connected to high levels of usability.

Just as I was finishing writing up this quick summary, I received an email, which was my daily ‘geek news’ summary.  I noticed an article which had an accompanying discussion.  It was entitled: Are High MOOC Failure Rates a Bug Or a Feature? (Slashdot).  For those who are interested in MOOCs, it’s worth a quick look.

Christopher Douce

Disability History Month 2013 Launch Event

Visible to anyone in the world
Edited by Christopher Douce, Wednesday, 27 Nov 2013, 18:29

It took me a good few minutes to find my way out of Westminster underground station.  When I did finally emerge to the surface, I found the houses of parliament towering above me.  After a minute or so gathering my bearings, I was on my way.  I roughly knew where I was going: past Westminster Abbey, then take a turning down one of the adjacent roads.  As per usual, I got to the site of the venue ridiculously early.  So early, in fact, that the organisers were still putting the chairs out (!)

The launch event for the 2013 disability history month was held on 19 November.  I attended a similar event in 2011 (blog post), which I found thought provoking, but my diary conspired to prevent me from attending last year.  There were a number of reasons to go along to this event: one is personal, and another is professional (a third reason could be considered to be political).

The day kicked off with Richard Rieser playing a clip from a recent BBC two-part documentary which described how the lives of people with disabilities had changed.  I did manage to see the first episode, which was about the care system, but I didn’t get to see the second episode (and I had missed downloading it on iPlayer).  The point was simple: we’re on the telly, and we’ve a right to be there.

Speakers

Just like the last launch event, there were a number of speakers.  The first speaker of the day was Kevin Courtney from the National Union of Teachers.  Representatives from unions featured heavily in the 2011 event, and this year was no exception.  Teachers are, of course, likely to encounter people with disabilities and they, of course, may have disabilities themselves.  Kevin drew our attention to some teaching resources that the unions had prepared for schools.

The second speaker of the day was Mike Oliver, who was introduced as a social model theorist.  By way of detail, the social model is a way of looking at disability where people are disabled not by their so-called impairments, but by the society in which they live.  Mike touched upon history before speaking about themes such as choice, control and independent living.  Mike underlined the significance of the current economic challenges.

The third speaker was Jan Walmsley, formerly from The Open University (an institution that now has over ten thousand students with disabilities).  Jan is a part of the Social History of Learning Disability research group (a research group that I hadn't heard of before).  The group was established in 1994 and one of its objectives is to share memories and experiences by people and for people by publishing life stories.

The two final speakers of the day were Jackie Downer and Kirsten Hearn.  Jackie described the importance of support workers and that technology can be a lifeline.  Kirsten gave an impassioned speech, emphasising the importance of rights, and echoing earlier points by saying that it was liberating that it ‘wasn’t me that was the problem, but the world’.

Plenary

One of the first points to be made was by Baroness Dame Campbell who emphasised the importance of political lobbying.  An audience member asked about the credibility of the social model, and whether we ought to be thinking in terms of a ‘post-social model’.  (The questioner mentioned the name of an academic called Tom Shakespeare).  This struck me as a difficult question to answer, and a quick internet search led me to a research paper (University of Leeds) that takes quite a bit of reading.  This question points us towards the growing discipline of disability studies.

Towards the end of the panel session, the issue of teaching (and teachers) was again returned to.  I seem to remember a reference to the learning resources that were mentioned during the start of the speeches.  The point of these was simple: there is a potential to ‘educate out’ discrimination (or to normalise difference) at an early age.

An alternative perspective

The final speech of the day wasn’t really a speech at all.  It was a stand-up comedy performance by comedienne Liz Carr.  I hadn’t seen Liz before, but I had heard of her work through a comedy group called Abnormally Funny People.  Unfortunately, I didn’t make too many notes during this part of the event, since I was laughing too much, but Liz did reference a recent challenge to the government’s bid to abolish the Independent Living Fund (BBC Website).  I also remember a startling gag about the right to work assessments.  This, to me, was the kind of comedy that cuts quickly to an issue and makes us think.

Reflection

There was a palpable difference between the 2011 event that I attended and this event.  The biggest difference, of course, reflects the change in the UK political landscape; there were many references to government cuts and the ways that they affect people with disabilities.  We were encouraged to reflect on history and the lessons that it offers us.  We also needed to be mindful of ‘what used to be’; stories of change, difference and individuality are important to remember and to keep.  One thing I felt was a steely will to retain rights and fight for new ones.

Christopher Douce

Media Training – Milton Keynes, 19 November 2013

Visible to anyone in the world

‘You want me to go on the radio, right?  And talk about my subject…?  You’ve got to be joking.  I’m not doing it!’   This imaginary conversation helped me to make up my mind to sign up for what turned out to be a really interesting (and fun) media training course.

For reasons that I won’t go into, I always thought that I wouldn’t ever be in a position to go anywhere near a microphone.  After having taken a couple of courses as a student with the Open University, I started to realise that I had always learnt a huge amount from the audio materials that accompany some modules.  Even if I wasn’t ever going to be on the radio as a subject matter expert, there might (one day) be a possibility that I may be asked to record some podcasts that might find their way into some module materials; signing up to a short half day media training course seemed like a very good idea (at the time).

This isn’t going to be one of my longer blog posts.  The course was pretty hands on.  We were asked to carry out two mock interviews; one was face to face (in a pretend studio), and another over the telephone.  Despite the clear emphasis on the practical, there was also a bit of theory that was worth remembering.

Firstly, it’s essential to come across as a person, i.e. don’t talk like a scientist.  If you do, you just end up sounding defensive (or like a politician) – scientists (or engineers) are different.  Let something about ‘you’ come across – sharing the personal is okay.  In fact, you should expect ‘the personal’.  We were also told to think of an interview in terms of having a cup of coffee with a friend.

During the session I learnt a couple of interesting phrases.  One of them was the phrase ‘news values’.  In retrospect, what makes a news story ‘news worthy’ is pretty obvious, but it’s a phrase that allows you to articulate what aspects of a story might be interesting (or relevant) to listeners.

One point that recurred was the subject of control, i.e. whether an interview is controlled by the interviewer or the interviewee.  We were clearly told that it is certainly okay to take the initiative.  It is important to answer a direct question – it’s okay to say ‘yes’ or ‘no’, for example – before returning to the main subject or focus of the interview by using linking phrases.

What do we do if we’re asked about subjects or areas that are beyond our area of expertise?  In this case, we might say something like, ‘this is a complicated subject, and for the purposes of this interview…’  There are many different audiences, and one audience is your academic peers.  Whilst it is important to acknowledge this group of listeners, it’s more important to consider the general listener.

Sometimes stating the obvious can really help.  When it comes to language, avoid acronyms and scientific or technical language that is specific to a subject (always consider the audience), and avoid language that is ambiguous (since you might come across as being evasive).

I’ve made a note of a quote that was used during the day.  It was by Alexander Graham Bell: ‘Before anything else, preparation is the key to success’.  Again, the point of this is pretty obvious: have a think about what you’re going to say before you get into the studio.

I found the whole experience both tough and interesting in equal measure.  I continue to have no immediate plans to set foot in a radio studio.  I am, however, slightly more aware of how things might work if I were ever called upon to make a recording.

My take away points are: expect the personal, think of it as a chat over a coffee, don’t use complicated language, and you should be free to take control: the interviewer has chosen to speak to you about your subject – you’re the expert in that situation, not the interviewer.

Christopher Douce

London Associate Lecturer development day, London, November 2013

Visible to anyone in the world

The Open University is divided into a number of regions and twice a year the London region runs a staff development event for its associate lecturers who live in and close to our capital.  This blog post is a brief summary of an event that took place on Saturday 16 November 2013.  My own role during the day was quite a modest one (I was only required to do a couple of introductions).  This meant that I was able to wear my ‘tutor hat’ for much of the day.

Challenges for ESL students

ESL is, of course, a common abbreviation for ‘English as a second language’.  From time to time I’m asked what the university is able to do to help students who struggle with English.  There are a couple of schools of thought about this.  One school of thought is that English and writing skills should be embedded within modules (this is certainly the case within computing and engineering modules).  Another school of thought is that there should be a particular course or module that is dedicated to writing (which is the approach that the science faculty takes).  There are, of course, pros and cons with either approach.  The aim of this session was to offer tutors useful guidance about different resources and materials that could be shared with students.  It also aimed to help tutors chat about different challenges they have faced.

One skill that was considered to be important was the reading of papers, and a point was made that this is something that can be practised.  Reading is, of course, a prelude to writing.  Although some people might argue that university level academic writing is something that is done only within the university (or academic) context, it can also be argued that learning how to write in an academic way can benefit learners in other ways, i.e. when it comes to writing for business and commerce, or the ability to distil evidence and construct cohesive arguments.

One question that was raised was, ‘how do you offer feedback in instances where students may struggle to read suggestions?’  This was a very good question, and sometimes interventions, or special sessions to help students are necessary.

Our discussions about writing led onto other discussions about plagiarism and academic conduct.  Plagiarism is, of course, a word that has very negative connotations.  In some cultures, using the words of an authority may be considered to be a mark of respect.  On the other hand, developing the ability to write in one’s own words is a really important part of distance learning; it’s both important and necessary for students to demonstrate how they are able to evaluate materials. 

The university has very clear policies about plagiarism and academic practice, and this is something that I’ve blogged about previously.  (Academic practice conference: day 1 summary, day 2 summary). From the tutor’s perspective, it isn’t an easy task to address these issues thoroughly and sensitively.  One thing that tutors could do is to run an activity (which exposes issues that relate to academic conduct).  Tutors (or module teams) could show how things should be done, and then tutors could facilitate a discussion using on-line forums, for example.

Another discussion that I’ve noted was the use of the ‘voice’.  Different modules may have a preference as to whether students can or should write in the first person.   One of the arguments about writing in the third person is that it allows other voices to be more clearly exposed.

During the session, we were all encouraged to do a bit of group work.  We were given a sample of writing and we were asked, ‘what resource would you choose to share with your students to try to help them with their writing skills?’  This was a fun activity and it emphasised that there are a lot of resources that both students and tutors can draw on.

To underline this point of resources, there were sets of study skills booklets that were available in the presentation room.  These had the titles:  Studying with the OU – UK learning approach, Reading and Taking Notes, Preparing Assignments and Thinking Critically.  If you’re interested, these can be downloaded from the Skills for Study website.

Developing resources and pedagogy for OU Live

I arrived at this afternoon session slightly late, since I was having too much fun chatting to colleagues.  OU Live is a synchronous teaching and learning tool (which is a posh way of saying that people can do things at the same time).  In essence, think ‘Skype with a whiteboard’.  It allows tutors to run on-line sessions with groups of students, offering both audio and text-chat channels.  From my own experience, running OU Live sessions can be pretty hard going, so I try to take every opportunity that I can (time permitting) to attend whatever training sessions the university offers.

This afternoon session was presented in two parts.  The first part was from the perspective of a science tutor (Catherine Halliwell), whereas the second part was from the perspective of a languages tutor.

Science perspective

I arrived in the session right at the moment when an important point was being made.  This was: ‘find a style of delivery that suits you’. It can be quite easy to use OU Live just to give ‘lectures’, but it is possible to use it to deliver dynamic interactive sessions.

One thing that tutors can do is to record their on-line sessions.  More students might use a recording of a session than are able to attend the live session itself.  One of the benefits of recordings is that they have the potential to become a very useful resource.  Tutors might, for example, refer students to sections of a recording when they start to revise for their exams.  Another thought is that tutors could explicitly refer to recordings when giving assignment feedback (guiding students to parts of a presentation where potentially difficult concepts are explained).

Catherine mentioned that her faculty had trialled the practice of pairing tutors together to run a single OU Live session.  Her module, a third level chemistry module, has 10 hours of tuition time.  Each session was shared; one tutor would take the lead, and the other would act as a ‘wing man’.

Another aspect of OU Live pedagogy which can be easily overlooked is the importance of preparation.  Students can be asked to carry out certain activities before a session, such as completing one or more worksheets – or even performing observations, with a view to sharing data.

Catherine also spoke about some features that I had never used, but had been (slightly) aware of.  One of these features was the ‘file transfer’ facility, which could be used by the tutor to send students sets of ‘unseen questions’, perhaps in the form of a Word document.  In some ways, this could be considered to be the electronic equivalent of giving everyone some handouts.  (I can also see that this would be especially useful during programming sessions, where tutors might hand out working copies of computer code to all participants).

We were given a number of very useful tips: make the first session as interactive as possible, and feel free to use a silly example.  Also, use things like voting, or drawing on a map.  Another thought is to turn the webcam on at the start so that the participants know who you are (you can turn it off after a few minutes, of course!)  Tutors should try their best to make their sessions friendly and fun.

There are a number of other points to bear in mind: some students can be reluctant to use the microphone, and this is okay.  Another approach (and one that I’ve heard of before) is to use OU Live as an informal drop-in session, where students are able to log in to have a chat with a tutor at a pre-arranged time.  It’s also important to take the time to look at a student’s profile to make sure whether there are any additional requirements that need to be taken into account.   Finally, because it’s possible to record a session, a tutor can always say, ‘I’m going to go through this bit quite quickly; because I’m recording this, you can always go back and play it back later if there’s anything that you miss’.

Languages perspective

The presentation from our language tutor was rather different.  We were given, quite literally, an A to Z tour of topics that relate to the use of OU Live, leaving us (and our facilitator), pretty breathless!

A couple of points that I’ve noted include the importance of developing routines and forming habits (in terms of running sessions at the same time).  It’s also a good idea to send group emails, both before and after sessions (so students are aware of what is going to happen).  In terms of preparation, it’s a good idea to get on-line around half an hour beforehand just to make sure that you don’t run across any technical problems or issues; having been confronted with unexpected Java software updates in the past, I think this is very sound advice.

During the question and answer session at the end of the afternoon, the issue of the recording of day schools also cropped up again.  Our tutors were very pragmatic about this: recording of OU Live sessions should happen, since it allows the creation of resources that all students can use (especially those who could not attend any of the sessions).  It is therefore important to let all students know that recording is going to take place either before events, or at the start of an event.

Reflections

There’s always something to pick up from these events.  There were two main things that I gained from this session.  The first was that the early discussions about language support consolidated what I already knew about the importance of academic conduct (and how the university procedures work).  Secondly, I picked up some tips about how to connect things together, i.e. connecting assignment feedback with the use of OU Live recordings.

The next event is to be held at the London School of Economics in March.   This event is likely to include a Mathematics Computing and Technology faculty specific session which will be held in the afternoon.  The fine detail hasn’t yet been decided on, but this too is also likely to be a good day.

Christopher Douce

Gresham College: A history of computing in three parts

Visible to anyone in the world
Edited by Christopher Douce, Tuesday, 15 Oct 2019, 15:47

After a week and a half of continual exam and assignment marking, I was relieved to finally be able to turn my attention to other matters (and get out of my house).  I had an idle question: I wondered whether there were any professors or lecturers in London who shared an interest in the history of computing or technology.  Rather than trawling through university web pages (which was the first idea that crossed my mind), I decided to ask the internet, searching for the words, ‘history computing lecturer London’.

One name was clearly at the top of the list, but it was something else a bit lower down the search results that immediately attracted my attention.  It was a series of lectures entitled, ‘a history of computing in three parts’.  My first reactions were, ‘it’s probably too late’ and, ‘you’ve probably got to pay a lot of money to go along to this gig’.  All this computer history stuff that I’m interested in has to be folded into my day job, which means that it’s easier to justify time but a whole lot harder to justify expenses.

After reading the paragraph that described the event, I cast my eye back to the heading.  I realised that the date of the lecture was TODAY!  The very same day I had done my Google search, Thursday 31 October!  After a few more clicks I discovered that the event was also FREE!  Behold, it was a miracle!  I looked at my calendar; the lecture started at four in the afternoon and provided that I managed to sort out some admin stuff and have a meeting with a colleague, I would probably have enough time.

The only fly in the ointment was that it was all booked up; there were no tickets remaining.  Who knew that the history of computers was such a popular subject?  No matter.  I was looking reasonably smart – I would try to talk my way in.

Lecture 1: Pictures of computers

After a few false starts I managed to find my way to a place called Gresham College (website); navigating my way out of Chancery Lane tube proved to be quite tricky. It is only in retrospect that I realised that this was one of those places in London that I really ought to have known about.  I just know that people who I speak to about this event will chuckle, slap their thigh and say, ‘oh yes, Gresham College...’ and then will look at me as if I’m some kind of idiot if I said that I had visited there ‘by accident’.

I strode purposefully down a long alleyway and was confronted by a smartly dressed gentleman who obviously had an important role to play.  I began my attack: ‘I’m, erm, here for the lecture…’, and was swiftly gestured towards a flight of stairs without a word.  I felt deflated!  I was expecting to have to fight my way into the lecture!  I soon found myself in an antechamber filled with men (and women) in anoraks looking at a projector screen, and noisily settled down to the first lecture, by Martin Campbell-Kelly.

I joined the lecture at the point where people were being shown coloured photos of office equipment and pictures of steel filing cabinets.  The context was that computers are machines that allow us to process ever increasing amounts of data (and there’s a whole history of manual record keeping that we can easily overlook).  We were then told something about the history of the Rand Corporation followed by parts of the history of the computer company IBM.

On the subject of IBM, he mentioned someone called Eliot Noyes (Wikipedia).  Noyes was to IBM what Jonathan Ive (Wikipedia) is to Apple (if you’re into industrial design).  Martin mentioned that mainframe computers had a particular look; for a time there was a particular ‘design zeitgeist’.  I’ve made notes that Noyes used to look over catalogues from the Italian company Olivetti, and not only designed computers, but entire rooms.  We were shown photographs of various mock-ups.

The creation of physical prototypes reminded me of some themes that are mentioned in a couple of design modules, either Design Essentials or Design for Engineers.  Martin also made reference to designer Norman Bel Geddes (Wikipedia).  He also showed us a whole host of other pictures of big machines, notably the ICL 2900 (Wikipedia) used in the Bankers’ Automated Clearing System (BACS).  (I have to confess to being dragged into the depths of the Wikipedia page about that particular ICL computer.  Should I confess to such a level of geekiness?  Probably not!)

Martin’s talk wasn’t really what I had expected but I found it pretty interesting (and it was a shame I missed the first quarter of it).  I was surprised by the detail that he provided about manual filing systems but I was also encouraged by the inclusion of information about designers.  The visual and industrial design aspect is an important part of computing history too.  Thinking back, one of my first computers had a very different aesthetic to the machines that I use today.  Function and fashion, combined with the wider perception of devices and machines, are perspectives that are inextricably linked.

After the lecture, it later dawned on me that I’ve actually read one of Martin’s books, ‘Computer: a history of the information machine’ which he co-authored with William Aspray.  It’s a pretty good read.  It covers a range of different strands; the pre-history, early electronic machines (such as the UNIVAC, which he touched on in his talk), before moving onto the emergence of the internet and software.  It’s tough to do everything but he has a good old go at it.

Lecture 2: Turing and his work

The second lecture of the day was by Professor Jonathan Bowen (website).  Jonathan talked about the life and work of Alan Turing (Wikipedia) and mentioned Andrew Hodges’s scholarly biography, ‘Alan Turing: the Enigma of Intelligence’.

Jonathan spoke about three key areas of Turing’s work: his work that relates to the fundamentals of computer science, philosophical work relating to artificial intelligence and his later work on morphogenesis (which now has strong connections to the field of bioinformatics).  He mentioned Turing’s birthplace, spoke about his PhD research which took place at Princeton University (with Alonzo Church being his doctoral supervisor), and also spoke about his work at Bletchley Park.  Other aspects of his life were touched on, such as his work at the National Physical Laboratory (NPL) in Teddington and his move to the University of Manchester.  During his time at the NPL, he worked on the design of a computer which then became the Pilot ACE (Wikipedia).  When he was at Manchester, he was familiar with the Manchester Mark I computer (a development of the Manchester ‘Baby’, the world’s first stored-program computer – and don’t let any American tell you otherwise).

What I liked about Jonathan’s talk was its breadth.  He covered many different aspects of Turing’s life in a very short space of time.  He also spoke of the ambiguity regarding Turing’s death, echoing what Hodges had written in his biography.

At the end of his talk, we were directed to a set of web links that might be of interest to some.  Last year was the centenary of Turing’s birth, and there is a commemorative website that contains a whole host of different resources to celebrate this.  There is also a site that is maintained by his biographer, Andrew Hodges (turing.org.uk).  Interestingly, we were also directed to an on-line archive of documents which can be accessed by computer scientists, historians or anyone else who might be interested.

Lecture 3: The grand narrative of the history of computing

The headline act of the night was Doron Swade.  I know of Doron’s work from the Science Museum, where he headed up a project to construct a working version of Charles Babbage’s design for his Difference Engine number 2.  Babbage (for those who don’t know of him) was a Victorian inventor and raconteur whose lifelong quest was to design and build mechanical calculating machines.  During his life, he battled with his engineer, faced the challenge of securing money for his ideas, travelled around Italy and hosted some famous parties (and did a whole lot more).

The title of Doron’s lecture was an intriguing and demanding one.  Could there really be a grand narrative about the history of computing?  If so, what elements or ingredients might it contain?  Doron told us that the history of computing is an emerging field and then posed a similar question: ‘what strings [the different] pieces together?’  He also reassured us that a clear narrative appears to be emerging.

The narrative begins with methods for accounting and number systems, i.e. mechanisms to keep track of numbers.  We could consider the pre-history to comprise artefacts such as tally sticks, or physical devices that can be used to ‘relieve or replace mental calculation’.  This led to the emergence of mechanisms that used moving parts, such as the abacus and the slide rule.  The next ‘chapter’ would comprise devices that embodied algorithms; their mechanisms carried out sequences or steps of calculations.  Here we have the work of Babbage and links to Hollerith (who was mentioned by Campbell-Kelly).

Doron then presented us with a challenge.  If we represent history in this way there is an implicit suggestion that there is a clear deterministic path from the past through to the present.  If I understand the point correctly, any narrative (or description of the past) is always going to be flawed, since there is so much more going on.  There could be situations in which nothing much happens.  A really interesting thought that Doron introduced was the idea of a ‘stored program’ being met with puzzlement and confusion, but this is an idea that distinctly defines what a computer is today.  (I haven’t made a word for word note of what Doron said, but this is something that has certainly stuck in my mind).

Another interesting point is that a serial narrative naturally excludes the parallel.  There is also an issue of reflexivity (to nick a posh word that I learnt from the social sciences); there is a relationship between history making machines and machines making history.  Linearity, it is argued, does a disservice.  One way to get over the challenge of linearity is to draw upon the stories of people.  These thoughts reminded me of a talk by Tilly Blyth, the current keeper of technologies at the Science Museum, about the forthcoming ‘information age’ gallery.  Tilly also emphasised the importance of personal narratives and cautioned against viewing history as a deterministic process.

One of the highlights of Doron’s talk was his ‘river diagram’ of the ‘history of computing’ (my ‘quotes’ at this point, since I don’t think I made a note of a ‘heading’).  Obviously, a picture is much better, but I’ll have a go at describing it succinctly.

In essence, the grand narrative comprises a bunch of different threads.  One thread that runs through it all is the history of calculation.  There is another thread about the history of communication.  In the middle, these threads are linked by ‘tributaries’ which relate to the subjects of automatic computation and information management.  These lead to another (current) thread of study which is entitled ‘electronic information age’.  I also made a note of a fabulous turn of phrase: the current electronic information age emerged from the ‘fusion chamber of solid state physics’.  Another part of the diagram relates to the different ways in which calculation or computation could be realised: mechanical, electromechanical or electronic.

I also made a quick note of what were considered to be the core ideas in computing: mechanical processes, digital logic, algorithms, systems architecture, software and universality (I’m not sure what this means, though) and the internal stored program.  A narrative, it was argued, comes from a splicing together of different threads.

Returning to Babbage, Doron said that '[he] burst out of nowhere and confounds us with schemes that are unprecedented', proposing mechanical calculating machines the size of rooms.  Doron also spoke about Ada Lovelace's description of Babbage's designs for his Analytical Engine, a machine that embodied many of the core ideas that are used in computing today: 'a fetch execute cycle, transfer of memory from the processor, programmable, automatic execution, separation of program and memory'.

Doron ended with a question: 'to what extent did this [Babbage's work] influence modern computing?'  The answer is, 'probably, not very much…' (my quotes this time, rather than Doron's), since many of Babbage's discoveries and inventions were rediscovered and re-implemented as computing devices were realised in different forms, moving from the mechanical to the electrical.  Doron argued that, because there is so much congruence between the different approaches, the ideas that were rediscovered and re-implemented may well be really important and fundamental to the subject of computation.  To paraphrase from Doron's book, Babbage isn't so much a 'great grandfather' of computing, more of a 'great uncle'.

Reflections

For me, Doron’s talk tied together aspects of the earlier talks.  Martin spoke about the history of information management and touched upon the electromechanical world of computing.  By describing the work of Turing, Jonathan spoke about and connected to the history of automatic computation.  One of the challenges that I’ve been grappling with is that there is so much history that is fundamentally interesting.  I’m interested in learning more, but it remains difficult to know which parts of a bigger picture to focus on. 

What I personally got from the day was confirmation that my interests in related subjects, such as communication technologies and the use, development and deployment of software (and algorithms), do indeed form an important piece of a 'grand narrative' in the history of computing and information technology.  Whilst I instinctively knew this to be true, Doron's river diagram, for me, drew together the different influences and connections in a very clear and obvious way.

Before heading home, I grabbed a brochure that had the title 'free public lectures', vowing that I would have a good look through it to see what else was going on.  After saying a few goodbyes to people I left the basement room and walked up a flight of stairs.  In the intervening hours it had become dark; time had passed and I hadn't really noticed.  When I got to the street I reached into my inside pocket for my smartphone to see if I had any messages.  A light was flashing.  I didn't have any messages, but I did have a few alerts.  A theoretical Turing machine rendered into a physical device was alerting me to a comedy night that was to take place later that week.  This was also a gentle reminder of how subtly technology had become entwined with my life.  Was I reliant on this little device?  That was a whole other question.

When I was heading home I asked myself, 'how come I never knew this Gresham College place existed?'  Perhaps it is one of those places that you only hear about if you're 'in the know'.  London, for me, is gradually revealing some of its secrets.

Christopher Douce

Ada Lovelace Day: City University London, 15 October 2013

Visible to anyone in the world
Edited by Christopher Douce, Monday, 28 Oct 2013, 13:42

After a day of meetings and problem solving, I wandered down to the basement where my scooter was parked.  I had a rough idea of the route I had to follow; I needed to head south from Camden town, navigate around Kings Cross and onto the Pentonville Road, pick up the A1 at Angel, and then try to find my way south.  Thanks to Google Streetview I had geekily rehearsed some of the trickier intersections – but I still ended up going the wrong way.

The reason for my Tuesday evening visit to City University was to attend an event that was part of a wider programme of events called Ada Lovelace Day (Finding Ada).  A website describes it as: 'an international celebration of the achievements of women in science, technology, engineering and maths (STEM)'.  Okay, so I'm not a woman, but I'm fundamentally interested in two related subjects: the availability and accessibility of education to everyone, and the history of computing – so it seemed a pretty cool event to go down and support.

Panel discussion

The event kicked off with a panel discussion.  The panel was introduced by Connie St Louis from City University.  The panel was a great mix of discussants from different sectors: the university sector, commercial sector and public sector.  Each discussant had a different story as to why they found science, technology or computing a fascinating subject.  

Whilst the subject of 'coding' (or the creating of computer programs) took centre stage, it was great to hear about photonic research (from Arti Agrawal) and Prim Smith's journey from programmer through to senior manager.  I particularly liked her description of how software can play a very important role in the provision of services in the public sector.  Vikki Read, from Unruly Media, said that 'it was important to give everyone the opportunity [to code]'.

Coding demo

After the introductions and initial questions came to an end, we were given a taste of what 'coding' actually is.  In reality, this meant that we were shown what a 'for loop' looked like in a language called M-script, which is used in something called Matlab.  For those who don't know anything about Matlab, it's a very complicated piece of software (I'm not going to say much more than this!)  It's used by engineering professionals to tackle some really tough engineering problems.

For me, there were two things that didn't work quite so well in this section: if you're going to introduce what coding is all about, Matlab wouldn't have been my personal choice; and secondly, the coding demo was carried out by a man (which didn't really seem to be in keeping with the day).  This said, we did get to see what M-script code looked like.
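
I didn't copy the demo down, so here is a rough reconstruction of the idea (my own sketch, written in Python rather than M-script): a 'for loop' simply repeats a calculation a set number of times.

    # My sketch in Python (not the M-script we were shown): a loop that
    # repeats a calculation five times.
    for number in range(1, 6):
        square = number * number           # the calculation being repeated
        print(number, 'squared is', square)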

Doing a livecoding demo that is compelling and engaging is always going to be tough.  You've got to write instructions that are understandable, effective and efficient, and that do something interesting.  It's not an easy task, and coders (in my humble opinion) only get into 'the zone' of coding (to appreciate the beauty and elegance of software) after a lot of hard work.

The Matlab demo was followed by a video presentation (YouTube) from code.org (website) which opened with the quote, 'everybody in this country should learn how to program a computer... because it teaches you how to think' (which, I think, is a good point).  I remember a quote from the video which goes something like, 'software is about humanity'.  By writing code and considering abstractions (and how best to describe problems and situations to a computer), we need to reflect on our problems.  We also perpetually interact and work with software, whether we choose to or not.  It could even be argued that although software and programming have their foundations in mathematics and the sciences, they are subjects that require a huge amount of creativity.

One of the panel members later made the point that being a scientist requires a huge amount of imagination.  The same, of course, can be said about software.

Question and answer session

The question and answer session was quite short and I didn't take too many notes during this part of the evening.  One of the questions asked was, 'how difficult is coding?'  This one is difficult to answer since it depends on a number of different factors: the language, the problem that you're trying to solve, and the level of motivation that you might have to solve it.  One other point that I do remember is a story about how one of the members of the panel gained her first job as an energy manager.  The short version of her answer was: it doesn't hurt to be direct.

Reflections

This event was all about outreach; its objective was to inform and inspire, which is something that is very tough to do in an hour.

Lovelace is a beguiling figure.  Her story is fascinating not necessarily because of what is known about her, but because of what is disputed.  You don't have to dig too far into her story to read about rumours of horse racing, gambling, debts and family jewels.  This said, she was certainly way ahead of her time (as were Babbage's attempts to build a computing machine) when she wrote about the way that machines could weave patterns with numbers.  Babbage was certainly indebted to her for translating (and adding to) Menabrea's description of his idea for the Analytical Engine.

During this event I was expecting there to be stronger voices calling more directly for more women in science, technology and engineering subjects.  I can remember a distinct gender disparity from my own undergraduate days when I studied computer science, and I can clearly see that this is continuing today when I drop into computing and engineering tutorials (but less so in design tutorials) to give our tutors a bit of moral support.  I'll be the first to put my hand up and say that I don't really understand the reasons why this should be the case.

To me, computing is not just cool, it is very cool.  In what other subject can you invent infinitely complex, interactive and unique universes out of nothing but numbers?  Not only is software the stuff of pure thought, but it is also a way to solve real-world problems (some of which were hinted at by one of the panel members).

Not only did I get lost getting to the City University, I also got lost trying to leave the building.  After a couple of false starts, I finally made it to the exit and out into the cool autumn air.  Minutes later, I had fired up the scooter's engine and was practically oblivious to the fact that deep inside the machine was some software (in my scooter's engine management system) that was helping to propel me on my journey home.

Christopher Douce

T217/8 module briefing

Visible to anyone in the world

Even though I’m based in the Computing and Communications department, I have to confess that I do a bit of moonlighting in the Engineering and Innovation department.  I don’t feel too guilty about doing this since there is a lot of cross over between some of the subjects.  One of the cross over subjects is design: computer systems need effective and efficient interfaces, and software systems need (obviously) to be designed (or engineered).  An important question is how we might set about creating different designs.  There are, of course, strong connections between the subjects of design and engineering too.

This blog post is from a set of notes that I made during a module briefing that I attended on 28 September.  Module briefings are events that happen whenever a new Open University module starts.  They're an opportunity for the module team to meet all the associate lecturers who have been recruited, and for those lecturers to ask questions about how things are going to run.

The event introduced two new design modules: T217 Design Essentials and T218 Design for Engineers.  These two second level modules follow on from a first level introductory module, U101 Design Thinking.  Whereas U101 (as far as I understand things) helps students to start to think like a designer through engendering a playful and reflective approach, T217 and T218 begin to focus on more practical and detailed issues that relate to products and how they might be manufactured.  The forthcoming T317 module, Innovation: Designing for Change, will move things along a bit further by considering wider issues, such as how design connects with and interacts with society.  (I've only heard snippets about this new module, so I had better not go on and say something that patently isn't true!)

The briefing wasn’t held in the university but in a nearby conference centre.  Without having done a head count, I estimate that there were about thirty people in total.  This includes T217 and T218 tutors and their line managers (staff tutors) who will be helping within things behind the scenes.  We were, of course, joined by members of the T217/8 module team (Theo Zamenopoulos, Georgy Holden and Jeff Johnson) and our curriculum manager (Hannah Juma).

Module structure

Much of the following information is available in the module description, but I'm including it here too (since it was explicitly covered during the briefing).  T217 comprises five blocks.  These are: exploring designs and designing, designing for people, creative designs, embodying designs and design for making.  The module comprises a set of printed books as well as a modelling workbook, which helps students to get to grips with sketching (an invaluable skill for communicating the designs of products).

There is, of course, a module website which leads everyone through the module materials a week at a time.  During the module, there are a set of skills development activities.  There are three types: design activities, assessment activities (which help to prepare students for the assignments), and workbook activities (which are all about building skills and confidence).

T217 is assessed by four tutor marked assignments and an end of module exam.  T218, on the other hand, is slightly different – it has three tutor marked assignments and is examined by a substantial piece of work, which is known as an end of module assessment.

Block summary

The first block is all about big ideas in design and how design relates to engineering, human and cultural perspectives.  It contains some ideas from the history of design and tries to get students sketching.  (The design of chairs features heavily in this first block, since chairs permit different aspects of, and perspectives on, design to be exposed).  The message of the second block is that it is essential to consider the end user.  This second block also takes some first steps towards thinking about environmental issues.

The third block is a bit different.  This block addresses theories of creativity and invention and how these are reflected in the practices of creative designers and engineers.  It exposes students to different techniques about how to help designers to become more creative.

Block four is about how to move from a broad concept design into a more detailed design that could eventually be manufactured.  It also continues to help students to think spatially and visually.  In this block there is also an emphasis on style and branding and how they relate to design.

The final block moves into even more detail.  It covers issues such as the choice of materials for prototyping and manufacturing, encouraging students to analyse existing artefacts.  I made a note of the terms, ‘materials, methods and emotions’ during the briefing.  The block also makes connections with open source projects and introduces students to maker and hacker communities.

Software

Although you might argue that the designer's most powerful tool is a pencil, developments in information technology have led to the emergence of new and exciting design and illustration tools.  Software and information technology can also take us towards new ways of working.  One of the challenges with learning design at a distance is that students don't have the opportunity to work within a studio space, where they might wander around to look at the work that other designers are working on, gaining not only motivation but also inspiration.  Building on the experience gained in U101, the T217 module team are using some on-line social software called OpenDesignStudio.  This is a web application that allows students to share aspects of their work with other students.  Sharing is done through the use of images.  Students might take photographs of sketches or rough physical models.

Students are also encouraged to use 3D drawing software, such as SketchUp (Wikipedia).  They can also use other (but different) 3D drawing software, allowing them to gain an appreciation of the differences between tools.  Other software includes a database of different materials and manufacturing methods (which students can use to inform their assignments), and a tool that can be used to create mind maps.  (Students who studied U101 will remember a software package called Compendium).

On-line and face to face tutorials

A part of the day was also spent discussing on-line and face to face tutorials.  Although most of the teaching (and learning) is performed through the module materials and the guidance that associate lecturers offer in response to tutor marked assignments, students can also attend a number of interactive tutorials.

The face to face tutorials (or day schools) are arranged by the regional centre that a tutor is affiliated with.  The on-line tutorials, on the other hand, are held in an on-line 'OU Live' room.  This is a virtual space where tutors can speak to students through their computers.  During the briefing, tutors were briefly shown how to use and access these on-line rooms.  Towards the end of the day, there was an activity where different groups of tutors got together to plan a day school (which is OU parlance for a bigger multi-group event, usually held on a Saturday) to help students become familiar with a module.

Reflections

I always enjoy going along to module briefings; they're always fun and useful events, and this was no exception.  A couple of weeks after this event both T217 and T218 began their first presentations.  For me, three things stood out during the day.  The first is the extent to which the design team are building on the work that they carried out in the earlier first level module, U101 Design Thinking.  The second is that T217 (as well as T218) has a very clear and compelling structure which relates to very explicit themes within design.  The third is the way that software and technology have been embedded within the module.

A final thought was that I was easily able to connect aspects of T217 and T218 to the module that I’ve been a tutor on for a number of years: M364 Fundamentals of Interaction Design.  I could clearly see links between areas such as user centred design, accessibility and the importance of skills such as sketching (as a way to rapidly communicate aspects of a design to others).  This, to me, underlined the importance and the need for connections between different subjects and disciplines.

Although I started this blog post by confessing that I have been moonlighting in another department, this term shouldn’t be used in a derogatory or negative sense.  When it comes to sharing perspectives and gaining insight into what happens in a slightly different (but connected) subject, moonlighting should be positively encouraged.

Acknowledgements

Many thanks to Theo Zamenopoulos (T217 module chair), Georgy Holden, Jeff Johnson (T218 module chair) and Hannah Juma (curriculum manager for T217 and T218) for running the briefing.

Christopher Douce

TU100 My digital life: AL development event

Visible to anyone in the world
Edited by Christopher Douce, Tuesday, 8 Oct 2013, 12:15

The second TU100 development day for associate lecturers in London and the surrounding regions was held on Saturday 7 September in the London regional centre.  The overall purpose of the day was to give associate lecturers who tutor on TU100 an opportunity to share experiences and to gather some useful feedback about the module that I could pass on to the module team.  These days are often great fun since everyone is very much up for sharing and talking (and this day was no exception).  This blog post represents a quick summary of what happened (from my own perspective, of course).

I’m writing this post for a number of reasons.  The first reason is to remember what happened on 7 September (since my memory is somewhat fallible), and the second reason is to give those tutors who couldn’t attend a bit of a feel for some of the subjects were discussed.  The third reason is to try to encourage other tutors to come along to other events that we run in the region.

There were essentially three different parts to the day.  The first part was all about teaching programming and Sense.  The second was about issues relating to student retention (where we heard about a university initiative called Project Retain), and the third was a general ‘feedback (or feedforward) to the module team’ session.

Session 1 : Teaching programming and Sense

During the first session we were put into small groups and Leslie, one of our very experienced TU100 tutors, distributed a questionnaire to inspire discussion.  The questionnaire had the headings: 'how does TU100 teach programming?', 'how does TU100 teach Sense?', 'student contact hours' and 'marking'.  Since I'm not a TU100 tutor I didn't contribute too much to the group discussions, but I did make notes of some of the themes that emerged.

It wasn’t too long before the subject of programming cropped up.  One of the comments I’ve made is that the module doesn’t contain too much about testing.  One other thought is that early on in the module it is a good idea to emphasise the importance of Sense, particularly the Sense programming guide.  Another thing that tutors could do is to emphasise the wealth of Scratch resources that are available from MIT, and that perhaps we should more explicitly brief students that Sense is an extension of Scratch.

We soon began to talk about the on-line sessions which are presented through Blackboard Collaborate (or OU Live, as the university calls it).  One of the challenges with using the OU Live software is that it takes time to hand over screen sharing control when tutors ask students to complete certain tasks. 

An interesting point is that OU Live might not only be useful for running tutorials.  Since it contains a facility to record sessions it can also be used to record how any application is used.  Tutors (or faculty staff) could use OU Live to make ‘video’ recordings to demonstrate some programming concepts.

One of the biggest challenges that tutors face is the marking of assignments.  Sometimes tutors come across puzzling situations, e.g. when students submit work where a screenshot shows a correctly functioning program, but the program that is submitted isn't actually correct.  When it comes to correspondence tuition, one of the fundamental challenges is to get into the head of the student.  This led to the question of whether we might be able to record video clips to show how students could have created correct solutions.

Plenary

After around fifteen or twenty minutes of chatting, all groups were asked to report back.  This section is a quick summary of some of the key points that some of the groups mentioned. 

TU100 doesn't contain a section that is dedicated only to programming.  Instead, programming can be found in different sections throughout the module.  One point mentioned by tutors was that whilst TU100 teaches coding, it doesn't say much about how to do the 'problem solving' part of programming.  Instead, students are required to spend time discovering how to program by exploring and playing with the Sense environment.

Aware of this issue, some TU100 London tutors have started to present the fundamentals of how to break problems apart into pieces that can then be used to create code (either in the face to face sessions or in the on-line sessions).  The precursor to TU100, M150, contained some materials to introduce students to something called structured English.  This led to a debate about whether some additional material might be added to TU100; the problem is that there are already lots of materials that students and tutors need to cover.

The point is that the foundations (in terms of learning to program) are really important, especially for students who might struggle with the fundamentals of programming.  One tutor said that some students never make it to the starting line with Sense, and that these kinds of resources could be a bridge between high level thinking and programming.  Some of the fundamentals that could be covered (by tutors) include the basic constructs of programming: sequences of instructions, selection, iteration, the use of variables and debugging.
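
Since Sense is a graphical language I can't easily reproduce it in text, but to give a flavour of the bridge that was being discussed, here is a sketch of my own (not TU100 material) showing how a structured English plan might map onto those basic constructs, with Python standing in for Sense:

    # My own sketch (not TU100 material): a structured English plan mapped
    # onto the basic constructs, with Python standing in for Sense.
    #
    # Structured English plan:
    #   set the total to zero
    #   for each mark in the list of marks
    #       if the mark is valid (between 0 and 100)
    #           add the mark to the total
    #   display the total

    marks = [72, 58, 105, 64]    # made-up example data

    total = 0                    # a variable remembers a running value
    for mark in marks:           # iteration: repeat for each item
        if 0 <= mark <= 100:     # selection: choose whether to act
            total = total + mark # sequence: one instruction after another
    print('Total of the valid marks:', total)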

One tutor said that 'we need to emphasise that it is important that students have a go' (so students gain an understanding of what the building blocks of software are all about).  Also, there is a need for a Sense forum: some area that allows the sharing of materials and ideas between students and tutors.

One piece of advice to students should be, 'go look at what people do with Scratch'.  Another comment was, 'add a couple of YouTube type videos about program analysis'.  The interactive nature of programming does lend itself to the use of OU Live, via application sharing, but on-line synchronous tutorials are always going to be difficult, and it takes a skilled facilitator to use the more sophisticated functions such as on-line break-out rooms.

Another perspective was that it might help the students if there was slightly more signposting to different resources.  (I understand that this is something that the module team have been working on for the new presentation).

Contact hours, tutorials and day schools

Different regions do different things when it comes to on-line tutorials and day schools.  When it comes to on-line time, the London region has given tutors the opportunity to schedule and run individual sessions.  The south region runs joint sessions, as does the south east region.

When it comes to the face to face sessions, all the London groups come together to form a series of big day schools, with the intention of creating a critical mass of both students and tutors.  In other regions, sessions are run by pairs of tutors.  The differences can be down to geography, both in terms of the location of the students and the location of the tutors.  One other thought from my side is that it is also important to emphasise to all students that they are encouraged to go to any of the tutorials that they might find in the tutorial finder (so they can discover evening as well as weekend events).

Some tutors use materials that are created by the module team, whereas others create their own materials.  One example is the London region tutors creating materials in structured English, with a view to trying to ‘plug a gap’ in the module materials (regarding how students new to programming might set about splitting a program into different components).

Another approach that some tutors adopt to use their allocated on-line time is to run drop-in sessions via OU Live.  The idea is that students can just pop into an on-line room to have a chat with a tutor if they have any questions.  I personally find this a really compelling way of making use of the on-line rooms, particularly when students might be wishing to chat about programming.  The technology breaks with the formality of a one-to-one conversation, but it also allows participants to see what is being displayed on a shared whiteboard.

Working with OU Live

The first tip (for tutors) was, 'remember to switch on your microphone'.  Another thought was, 'can we make headsets compulsory please?'  The reason for this is simple: when students use the microphone and speakers that are built into a laptop, a whole group of participants can easily be distracted by feedback, making communication a whole lot more difficult.

In some respects, participating in an OU Live session can be quite intimidating, and one observation was that there are lots of students who don't want to speak at all.  Some students prefer to use the text chat window rather than the microphone, which can make it quite difficult for the tutor to keep on top of everything (which is why some regions share OU Live sessions between tutors).

One point was that it is useful to 'do something' every 20 or so seconds.  This might be asking students questions, requiring them to respond with yes/no answers.  Another thought is to use a series of polls to assess understanding of certain concepts.  (One thing that I have personally learnt from my experience with the South East of England training is to poll students using the 'happy face' button, i.e. by asking the students, 'is everyone happy? can you click on your happy face?'  When you ask this regularly, it helps to keep the students' attention).

Marking of code

This section of the plenary discussion echoed an earlier point: when it comes to communicating what needs to be done in complicated TMA questions (which involve programming), could the module team produce a video showing how things should run, or how they might have been constructed (using Sense)?

I’ve learnt that there are two different ways to add comments into Sense code.  One way is to use something called a comment window.  Another is to add some in-line comments.  I made a note of a debate about the use of different types of comments and that in previous assignments a TMA question asked students to add comments.  The consensus was that comments help; they help students to reflect on the code that is being written and help tutors to understand what has been submitted.

Project Retain

An interlude between the first and second sessions was presented by Maggie King, our associate dean for teaching and learning.  One of Maggie's responsibilities has been to be a part of a university-wide project called Project Retain.

Project Retain is intended to increase the university's retention (and progression) of students across different levels of study.  The project has given the university a number of recommendations, which include: offering a guide to key learning points and module materials, scheduling and communicating real-time contact sessions during the first two weeks of a module (ideally through a letter), opening module materials and websites before the module starts, and making it clear when assignments are coming up (so our students are not surprised when they have to submit their assignments).  The first year of study, it was argued, is absolutely crucial.

Session 2 : Retention

Terry, one of our experienced TU100 tutors, facilitated the second main session of the day, which was also about retention (an issue that affects student satisfaction scores, recruitment and funding).

Terry introduced us to the HEFCE performance indicators.  These include dimensions such as the National Student Survey and other aspects such as the measurement of research performance.  Terry also introduced the difference between retention and progression.  Progression is all about moving from one level of study to another.  In some circumstances students can defer, allowing them to take a bit of time out from study and enabling them to pick up a module again at a later date.

One of the biggest changes in the university over the forthcoming couple of years will be the introduction of something called student support teams.  Since more and more students will be registering with the intention of studying for a particular qualification, student support teams will play an important role in helping students with their choices along a study pathway – which, it is hoped, will positively impact on student retention.

Terry covered a wealth of material, including sharing with us points from a National Audit Office report, drop out rates, how retention in UK HEIs compares with retention in other countries, and how the university compares with others in the National Student Survey.  During his session Terry asked us to consider the causes of student drop out during different stages of study, such as pre-entry, induction, on-programme and movement to the next level.  In the university, tutors, student advisors and module teams all have an important role to play.  The final question of the day was, 'how can the university support you in the task of improving retention in your tutor groups?'  This was an exceptionally good question to ask and is something that I'm keen to pick up on and delve into when I have a bit more time.

Session 3 : Open Session

I have to confess that I didn't take too many notes about this final session, mainly because we ran out of time!  Everyone was very willing to share experiences and opinions throughout the day, which was one of its fundamental objectives.

Reflections

One tutor made the comment: 'you can make a full time job of teaching TU100'.  TU100 is, without a doubt, a very big module: there is a lot of material and there are a lot of demands on a tutor's time.  What struck me about this day was the willingness of tutors to do their utmost to help their students along their TU100 journey, and to share experiences with each other.  The event had lots of energy and there was a lot of positive talking going on, yielding some very good ideas.  From my own perspective, I certainly hope to be running a similar event next year.  I've already had a couple of thoughts about what we might do.

I learnt quite a few things from this session.  I've learnt about the opinions that tutors have about certain aspects of the module, and I'll be happy to forward these directly to the module team.  It is also clearly apparent that some students struggle with programming, and the idea of producing some video material to help explain certain concepts might be something that could be useful.

Acknowledgements

Many thanks to all our TU100 associate lecturers who kindly gave up their valuable time to attend this event on a Saturday. If any of the tutors who have attended would like to add further comments, please don’t hesitate to comment below.

Christopher Douce

European Innovation Academy

Visible to anyone in the world

In July I went to something called the European Innovation Academy. The idea behind the academy was to get groups of students together with the intention of creating a product or solution to a problem.  (By product, think of ‘mobile app’ or digital service of some kind).  As a part of a three week programme, students were taught about what is meant by innovation, introduced to concepts such as user centred design and different business models, before being presented with some talks about how to further develop their ideas.  At the end of the third week, participants were encouraged to write a short pitch to sell their product, solution or service, to potential investors, with a view to securing further funding.

Making skills visible

A couple of months earlier, I went to a UK Higher Education Academy event (blog) that was all about how best to teach programming to those students who want to learn how to develop software for mobile devices.  What struck me was the point that if students want to get ahead, a really good idea is to create some kind of product that can be sold through vendor app stores (such as Google Play).  The advantage of doing this is that you advertise your skills in a very direct way and can clearly describe what you've done and achieved on your CV.

A substantial part of the academy was all about creating something.  As far as I understand it, there was time on the programme to allow students to not only learn about different platforms and tools, but also time to try (as best as possible) to create some prototype software that could be demonstrated to others.  Creating an artefact, as far as I could see, was considered to be a really important aspect.

Taking software further

A number of years ago, I used to have a job as a professional software developer.  It was thinking back to these times that I asked myself a fundamental question: ‘what on earth could I potentially say to the participants to help them appreciate some of the challenges inherent in creating software systems and products?’  I’ll put my hand up and say that I’ve always had one foot more firmly in the technology side of things than the business side.

I hit on the idea of talking not only about software itself, but also about some of the more human sides of software development.  Software is, of course, a creative product, and there are things that we can do (in terms of structuring how people work together) to get things done.  The things that we choose to do, however, are fundamentally affected by the type of product that we're creating.  Some products or solutions require us to use different methods.

So, what did I talk about?  I had three hours to fill!  Below is a quick summary of what I considered to be the highlights.  The participants might have different views.

Challenges

First of all, I asked the groups some questions to help them identify what they considered to be the most important or significant challenges that they needed to address.  When you're going headlong into a development, I thought it might be useful to find a bit of time to step back: to ask the participants about the problems they were facing, whether they might be able to share some advice about how to solve some of those problems, and how to manage some of the risks that each project group might face.

Interaction design

Since the participants were creating prototypes, I talked a bit about the process of interaction design and the idea of different types of prototypes (i.e. horizontal prototypes and vertical prototypes).  I also spoke about the necessity of considering the user, the task and the environment, since considering all of these aspects is really important to the final usability of a system (and usability will fundamentally influence whether or not a product or idea is accepted).

Processes

You could argue that interaction design is all about process.  I also introduced the idea of software development processes, notably agile development, which emphasises regular and constant communication between developers and stakeholders.  I made the fundamental point that constant communication is a necessity since software is an intangible product; the only way to make software real is to talk about it.  Agile methods facilitate that talking.

Testing

In some software development cultures (and each culture is slightly different), software testing can be an integral component, but it is a subject that can be very easily overlooked.  Software testing is a pretty big subject, covering a huge variety of different techniques and approaches.  When we move from the small to the large, we fundamentally need to make sure that things work as they should (since if things go wrong, our customers don't get a good experience).  I spoke about two important aspects of testing (and highlighted a bunch of others): different types of usability testing, and test driven development.
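
Test driven development is easier to see than to describe.  As a minimal sketch (my own example, not one from the talks) using Python's built-in unittest module: the tests are written first, they fail, and then just enough code is written to make them pass.

    # A minimal test driven development sketch (my example, not from the
    # talks), using Python's built-in unittest module.
    import unittest

    def apply_discount(price, percent):
        """Return the price after applying a percentage discount."""
        if not 0 <= percent <= 100:
            raise ValueError('percent must be between 0 and 100')
        return price * (100 - percent) / 100

    class TestApplyDiscount(unittest.TestCase):
        # In TDD these tests come first; apply_discount is then written
        # (and rewritten) until they pass.
        def test_ten_percent_off(self):
            self.assertEqual(apply_discount(200, 10), 180)

        def test_invalid_percentage_is_rejected(self):
            with self.assertRaises(ValueError):
                apply_discount(200, 150)

    if __name__ == '__main__':
        unittest.main()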

Abstraction

Abstraction is, perhaps, one of the most important and fundamental concepts in computing.  An abstraction could be described as the essence of a concept, stripped of any superfluous detail or ideas.  When our abstractions are right, our software becomes easy to work with.  Abstractions represent a really important way to manage complexity.  We need abstractions within our code because programmers can only deal with a limited number of details and connections between parts of a program at any one time.

One approach to creating software is to create our code in different layers.  Software developers constantly use code libraries, as well as consuming data from other information sources.  When talking about abstractions I also introduced the idea of design patterns.  These are templates of common solutions to coding (and software design) problems that have been shown to occur time and time again.  Coming back to the point about processes and the need for constant communication: if we can put a name against our various types of abstraction (which is something that the concept of a design pattern does for us), this can make communication between developers a whole lot easier.
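
To make this concrete, here is a small sketch (mine, not an example from the talk) of one well known pattern, the Observer, written in Python.  Once the pattern has a name, a developer can say 'the logger observes the document' and a colleague immediately knows how the pieces fit together.

    # A sketch of the Observer pattern (my example): a subject keeps a
    # list of observers and notifies each of them when something happens.
    class Subject:
        def __init__(self):
            self._observers = []

        def attach(self, observer):
            self._observers.append(observer)

        def notify(self, event):
            # Tell every registered observer about the event.
            for observer in self._observers:
                observer(event)

    # Usage: any callable can act as an observer.
    document = Subject()
    document.attach(lambda event: print('logger saw:', event))
    document.attach(lambda event: print('screen saw:', event))
    document.notify('file saved')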

Version management

When you’re working with code things can get very complicated very quickly.  There’s multiple files, different versions of libraries, you might include a whole bunch of different graphics or change database structures or web services... and then the bugs start to creep in and give both you (and your customers) a whole set of headaches.

I felt that it was important to say something about version control and configuration management.  When we're in the zone of high productivity (when we're at one with the problem and our tools), creating new products and services, we can quickly lose our own history in the path of continual change.  Version management systems (or whatever you choose to call them) enable some aspects of development history to be captured and saved.  One challenge that we need to be aware of is that the use of these tools requires discipline.
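
The talk didn't prescribe any particular tool, but to give a flavour of the discipline involved, a minimal session with Git (one widely used version control system; the file name is a made-up example) might look like this:

    git init                 # start recording history for a project
    git add app.py           # stage a file (a made-up example name)
    git commit -m "First working prototype"
    # ...more changes, more commits...
    git log --oneline        # review the history you have captured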

Technologies

To create any software of substance, you've got to use some technology that already exists.  If you're creating apps, you're going to use some kind of integrated development environment (which consists of programming languages, debuggers, code profilers and a whole bunch of other goodies).  Another point that I wanted to make is that there is a whole set of other technologies that developers can draw upon.

One really useful concept is the software framework.  In essence, frameworks can be considered as a set of high level abstractions that allow developers to solve common problems more efficiently.  A framework can allow you to work more quickly (and hopefully more efficiently) by building on the work of other developers.  Two challenges include: figuring out which framework to choose (and whether it really would help you or not), and then understanding how a framework might work.
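
No particular framework was prescribed on the day, so as an assumed illustration: in Python's Flask framework, a complete (if tiny) web application is only a few lines, because the framework takes care of routing, HTTP handling and the server loop.

    # An assumed illustration of what a framework buys you: a tiny but
    # complete web application in Flask (install with: pip install flask).
    from flask import Flask

    app = Flask(__name__)

    @app.route('/')
    def hello():
        # Flask routes the incoming request here and handles the HTTP
        # plumbing for us.
        return 'Hello from a framework!'

    if __name__ == '__main__':
        app.run()  # starts a development web server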

Another broad set of technologies that developers might utilise is web services.  Web services can now be used to store data and host applications.  Rather than having to manage their own servers and systems, an app developer might be able to use services that have been developed and deployed by other companies.  The challenge lies in figuring out what these services are and making choices between the different possibilities.
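
Consuming a web service can be as simple as issuing an HTTP request and decoding the reply.  A sketch in Python using the requests library (the URL is a made-up placeholder, not a real service):

    # Consuming a hypothetical web service with the requests library
    # (install with: pip install requests).  The URL is a placeholder.
    import requests

    response = requests.get('https://api.example.com/v1/scores')
    response.raise_for_status()  # fail loudly if the service reported an error
    scores = response.json()     # decode the JSON payload into Python objects
    print(scores)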

Community

In terms of software, the word community can be interpreted and understood in a number of different ways.  There might be a user community or a developer community, for instance.  You might want to share information about an emerging product through blogs, and direct interested users to these updates through Twitter.  My point was that community, whatever form it might take, is fundamentally important.  Although technology is a necessity, technology won't develop, change or improve if there isn't a community of users or developers that is keen on using or enhancing a system.

Another notion of community lies with the area of open source software.  I understand that during earlier parts of the academy, students were introduced to different types of business models.  Some business models work through the use, application and development of open source software.  In some situations, open sourcing a development might be a part of a wider strategy.  If so, then it is fundamentally important to consider how to support and nurture a community that makes use of any software (or service) that is made available to others.

A final connection to the notion of community that came to mind was the importance of partnerships.  Creating effective software and services requires a lot of specialist skills and expertise.  I remember one story from a HEA event that I attended some time ago: I was shown an example of a collaboration between a graphics artist and a programmer that led to the development of a really nice product; an interesting and playable mobile game.  A fundamental point was that sometimes the best work we do is when we work with others.

Reflections

A personal reflection is that putting together this series of talks seemed to take up quite a lot of time, but it was pretty good fun thinking about what to include and what not to include.  I asked myself a really simple question: 'if I was there, as a delegate on this programme, what would I really want to know?'  In retrospect, I fear I might have crammed in too much material, perhaps covering too many ideas or too many technologies in what was a very short space of time.  On the other hand, I think this was the point of the programme: to introduce people to new concepts and ideas, and to allow those on the programme to be fundamentally challenged.

One thing that struck me was that some of the teams gave the impression that they needed more developers; more people who were able to use the software development environment to create new products.  If you've never seen an integrated development environment before, the learning curve is practically vertical - it takes time to appreciate its intricacies and idiosyncratic ways.  Three weeks is an impossibly short time to come up with a new innovative idea that actually does something if working with technology isn't something that you do all the time.

Since I attended the programme during the third week, I wanted to positively tantalise the participants.  I wanted to say to them, ‘you know, all this tech that you’re playing with, and all these cool prototypes that you’re creating using tools that you’ve never used before? Well… this is only the beginning – there’s a whole bigger world of software tech out there ready for when your ideas and inventions become real.’  I hope I managed to expose some of that bigger world of software to some of them.

Christopher Douce

Google: celebrating the UK's computing heritage

Visible to anyone in the world
Edited by Christopher Douce, Monday, 28 Oct 2013, 13:37

On 1 July I attended an event at one of Google's offices in London to celebrate the UK's computing heritage.  The event was in five parts.  The first was a panel discussion about the very early days of the internet.  This was followed by the screening of a short film, a presentation by Tilly Blyth about the Science Museum, some information about the National Museum of Computing and the reconstruction of a computer called EDSAC, and a closing Q&A session.

I was immediately struck by the names of some of the speakers; people who were, and continue to be, fundamental pioneers of the internet.  During the event I made quite a few notes, only to discover later that parts of the evening had been recorded and made available on YouTube.  So, if you're interested, do go and visit the links that are featured in this quick blog post.  They're certainly worth a look.

The history of the internet

The first session, a panel discussion, comprised Roger Scantlebury and Peter Wilkinson from the National Physical Laboratory (NPL), Peter Kirstein (Wikipedia) from UCL, University of London, and Vint Cerf (Wikipedia) from Google.  You can view this really interesting discussion by going to the video recording on YouTube.

For the sake of completeness, however, I'm also going to leave you with some of my edited notes, which more or less reflect the bits that piqued my interest.  There were occasions when I became so engrossed in the discussions that I forgot to take notes!  So they are, by necessity, very coarse and incomplete.  I recommend the video over my notes.

As soon as the discussion started, I started to remember things that I had read in various histories of the internet.  Donald Davies, who worked at NPL, initiated a project that was intended to be national in scope - in some ways, similar to the Arpanet.  NPL has played an important role in the history of computing (and the internet).  Alan Turing moved to NPL to work on the ACE computer (Wikipedia), after spending time at Bletchley Park and working on voice scrambling systems.  This led to the development of the English Electric DEUCE computer (Wikipedia).

As an aside, I was really interested to learn that the NPL chose to make use of a Honeywell DDP-516 (Wikipedia) as the basis for some of their networking designs.  This happens to be the same machine that was used as an Interface Message Processor (Wikipedia) in the Arpanet project.  (It also turns out that the contractor that developed the IMP, BBN, visited NPL - interesting stuff!)

Peter Kirstein spoke about how he and UCL became involved.  Politics, of course, proved to be a fundamental issue.  ARPANet was connected to a seismic array based in Norway called NORSAR, which could be used to detect Soviet nuclear tests.  Vint Cerf made some really interesting points: that the challenges were mostly bureaucratic rather than technological, and that getting people to communicate is the harder part.  Like I said: the video is better than my notes!

LEO: Lyons Electronic Office

I've known of the LEO computer for a very long time, but it isn't a machine that I know too much about.  Google has sponsored the making of a film to celebrate the LEO computer (YouTube), which is certainly worth a watch.  I was very surprised to see a number of the participants in the film in the audience.  This underlined how recent this history is, and how phenomenally quickly technology continues to move.

Science Museum: Information age gallery

Tilly Blyth, from the Science Museum, London, spoke about the development of a new 'information age' gallery.  The aim of the gallery is to celebrate the last two hundred years of communication and information technologies (I hope I've got this right!)  Tilly described its narrative approach; the museum has chosen twenty one different stories.  (I've made a note of four.)

The first is an exhibit of the last manual telephone exchange that was used in the country.  This physical artefact has the power to convey not only changes in technology but also changes in work practices.  Another exhibit relates to the LEO computer, which I'm sure will be interesting and enchanting in equal measure.

Some current technologies have their own interesting histories.  There's also going to be an exhibit about the global positioning system.  Commerce and information can now be more readily connected to physical locations.  I was reminded of these new apps where you can hail a taxi by pressing a button on your phone.

The final teaser was a mention of an exhibit that related to how technology was consumed and used in developing countries, such as Cameroon.  We can so easily get wrapped up in our own worldview that we can easily forget that information and communications technology has a global impact.  We were told that the museum was working with an anthropologist with a view to trying to understand how devices are used in different cultures.

I've taken a note of the phrase, 'stories of contrast'.  I'm looking forward to its opening.

EDSAC Reconstruction

David Hartley, the director of the National Museum of Computing at Bletchley Park, spoke about the history of the museum.  David spoke about significant machines, such as the Harwell Dekatron computer and the Colossus reconstruction.  He also touched upon the role of the Computer Conservation Society, emphasising its importance by saying that 'there is nothing so boring as a dead computer'.  David also mentioned that there were parallel cultures at the museum; one that related to the more traditional role of a museum, and one that related to machine reconstruction (and preservation).

The second film of the day was entitled EDSAC - A cultural shift in computing (YouTube).  This video described a project to rebuild a historic computer.  It's certainly worth a look if you're interested.

Closing session

The opening question, to Vint, was 'did you have the notion that the internet would change the world?  What were you trying to achieve in those days?'  Vint spoke about a range of different things, and mentioned Douglas Engelbart's mother of all demos (YouTube) among other influences.  Vint also spoke about IPv6, space travel, the history of TCP/IP and ubiquitous computing.  The question and answer session has also been recorded (YouTube).  Some really great questions!

Reflections

One thing that struck me was how many people attended the event.  I was amazed!  Another thought is that it really did feel like a celebration.  I was also surprised to see some of the people who featured in the films that were screened sitting in the audience.  This reminded me of how close we are to our own history, and how we are all wrapped up in it too.

When we're in the middle of change we can't easily see the rate at which it is happening.  Events such as this one help us to step back and realise how far we've come in such a phenomenally short time.  A really good point was that whilst the technology is, in its own right, pretty interesting, it's the human structures and the politics that have to be negotiated to really allow things to work.  Arguably, these represent the tougher challenges.

We have a reflexive relationship with technology.  We make technology by working with people.  When we've made something, that technology has the potential to change us too.  An implicit challenge that each of us faces is to understand and acknowledge the extent of these changes.

Christopher Douce

HEA Workshop: Teaching and learning programming for mobile and tablet devices

Visible to anyone in the world
Edited by Christopher Douce, Monday, 3 Mar 2014, 18:44

On 25 June 2013, I popped over to the London Metropolitan University to attend a HEA sponsored workshop that was all about how to best teach the programming of mobile devices.  My role there was to present something about an OU module that I help out on: TT284 Web Technologies, but I'll be saying a bit more about that in a little while.

Yanguo Jing, from London Met, kicked off the day by talking about the twenty credit MSc module that he leads.  Yanguo said that his module is strongly connected with industry and various technology vendors, and that important themes are innovation and enterprise.  Importantly, students have an opportunity to carry out research themselves, create their own projects, develop their own apps and present their own findings.  One way that they do this is by making their own videos (which is also a great way to create evidence that can be contributed to assessments).

Yanguo also mentioned something called the Wow Agency.  One of the important points of having a more direct connection with industry is that students get more immediate exposure to its demands.  This was thought provoking stuff.

Teach the future, not the past: Blackberry 10 development

Luca Sale and Simon Howard gave the first of two vendor presentations.  I'll put my hand up and say that I know next to nothing about developing applications for Blackberry devices.  In fact, I don't think I've ever used a Blackberry device other than to scroll through a message, when a friend briefly gave me their device to look at!

This presentation was all about developing for a new device, the Blackberry Z10.  I have heard bits and pieces about this: the new device runs a totally new operating system, Blackberry 10.  Interestingly, it is based on an operating system called QNX (Wikipedia) (which I had vaguely heard of before).  Basically, it uses a microkernel architecture (which means it has a way to enforce stronger separation between the hardware and the main operating system that runs a device), it's pretty small, and it is used in a range of different embedded systems.

Apparently, there are a number of software development kits (SDKs), which means that it's possible to take an existing Android app and port it to the Blackberry (and have it deployed to users via the Blackberry equivalent of an app store).  The SDKs that were mentioned included Qt, HTML 5, Blackberry native, Adobe Air, and the Java Android Runtime.

There was a quick live coding demo of how to create apps using the HTML 5 framework.  Other languages that might be used to craft code included Javascript (in conjunction with HTML 5), C++, and Java (as far as I understand).  At the end of the presentations, Nafeesa Dajda described the Blackberry Academic Programme (Blackberry).
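As a rough illustration of the kind of thing such a demo covers, here's a minimal sketch of an HTML 5 'app' written in TypeScript.  This is my own invention rather than the actual Blackberry WebWorks API, and the element ids and page structure are hypothetical; the point is simply that an app of this kind is a web page plus a small script, packaged up and deployed to the device.

// A hypothetical, minimal HTML 5 'app'.  The page is assumed to contain:
//   <p id="counter">Taps: 0</p>  and  <button id="tapMe">Tap me</button>
// The script counts button presses and updates the label on screen.
let taps = 0;

document.addEventListener('DOMContentLoaded', () => {
  const button = document.getElementById('tapMe');
  const label = document.getElementById('counter');
  if (button && label) {
    button.addEventListener('click', () => {
      taps += 1;
      label.textContent = `Taps: ${taps}`;
    });
  }
});

Because the logic lives in the page rather than in platform specific code, the same page and script could, in principle, be packaged for several different platforms - which, as I understand it, is a large part of the appeal of the HTML 5 route.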

Microsoft devices and services

Lee Stott continued the vendor specific part of the day by giving a Microsoft themed presentation.  Microsoft, of course, has been investing significantly in the mobile devices space.  Not only do they have Windows phones, but they (of course) also have their touch PCs - so much so that their new operating system (Windows 8) aims to create an experience specifically for tablet devices.

Lee talked about software ecosystems and mentioned that services (as well as devices) are important too.  Services can also be thought of in terms of cloud services, and we were told that the cloud was becoming more and more important.  Since data is stored elsewhere, users have the potential to move between different devices and still have access to their documents and data, thus enhancing the user experience.

One of the most interesting parts of Lee's talk was where he spoke about the Microsoft Azure services.  I have to confess that it's been quite a while since I've been a Microsoft developer (in the intervening years I've done some PHP and coding using open-source frameworks), so it was useful to learn what the company has been up to and what services they are offering.

One of the challenges that I've always puzzled over is how, if you run your own tech company, you might go about running and maintaining your own servers and databases.  System administration is a necessary evil: getting to grips with real kit and devices is important, but it is a detailed technical specialism in its own right.

If I've understood this correctly, Microsoft can host a virtual server, which can then host your own database.  I'm also assuming that, if you want, you can write your own web services to do whatever magic stuff you need to do, which can then be consumed by users of mobile devices (or any other kind of client).  Customers of this service are then billed per minute of processor time.  I can see the benefits; server plant depreciates quickly and keeping it maintained is always going to cost money.  I find this approach to hosting and consuming data really interesting, especially since it offers a way to devolve risk to a third party.  Of course, there are a number of competitors (Wikipedia) in the cloud services arena.  This whole area seems to be a new subject in its own right.
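To make the pattern concrete, here's a hedged sketch in TypeScript of the kind of client I have in mind.  The endpoint URL and the shape of the data are entirely hypothetical (this isn't a real Azure service); the point is that the client only ever sees HTTP, and never the servers, databases or plant behind it.

// A hypothetical cloud-hosted web service, consumed over plain HTTP.
// The URL and the 'Reading' shape are illustrative, not a real service.
interface Reading {
  sensorId: string;
  value: number;
}

async function fetchReadings(): Promise<Reading[]> {
  const response = await fetch('https://example-service.example.com/api/readings');
  if (!response.ok) {
    throw new Error(`Service returned status ${response.status}`);
  }
  return (await response.json()) as Reading[];
}

// Any client - a mobile app, a web page, a desktop program - can call this.
fetchReadings()
  .then((readings) => console.log(`Received ${readings.length} readings`))
  .catch((error) => console.error('Could not reach the service:', error));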

Just in case you're interested, here are a couple of links I've gathered up: the main Microsoft Faculty pages, the UK faculty connection blog, and a link to the Azure education blog.  Another link is DreamSpark, which seems to be about giving students and institutions access to some of the latest tools and technologies.

TouchDevelop for Windows Mobile 8

The next talk was by David Renton, who is a lecturer in Computer Games Development.  David introduced a platform called TouchDevelop (Microsoft website), which used to be a Microsoft Research project.  TouchDevelop is a programming language that has a graphical feel: programs created using it have the appearance of a textual language, but elements of code can be created using a series of menus (as far as I can understand).

The software that you can create using TouchDevelop can be run on different mobile devices.  In some respects, TouchDevelop occupies the same space as Scratch (MIT website).  David made the point that it's difficult to create good games in Scratch.  I can (personally) neither confirm nor deny David's assertion, but my own view is that Scratch is a fun and useful environment which allows users to escape from the tyranny of syntax, whilst at the same time gradually introducing them to different (and essential) programming constructs.

What was really interesting was that TouchDevelop contains cool stuff, such as a physics engine.  By providing such a facility, I can certainly see how and why such an environment could be particularly interesting and engaging.  Again, for those who are interested, David has a blog called Games4Learning.  A final interesting point is that TouchDevelop runs in a web browser, so will work on different platforms.

Shorter presentations: Lua and Corona, Digital Summer Camp

Ian Masters gave a short presentation entitled 'teaching cross-platform mobile development using Lua and Corona'.  Corona (website) is an SDK and Lua (Wikipedia) is a programming language.  As with TouchDevelop, Ian demonstrated the use of an integral physics engine.  During the follow on discussion, there was quite a bit of talk about the Unity Engine (Wikipedia), which I've heard mentioned at a number of other HEA gaming and mobile events.

Martin Underwood talked about Digital Summer Camp, an event where universities, colleges, industry vendors and other organisations have come together to help inspire young people who are interested in technology.  The Open University is also one of the 'digital skill leaders'.

iPhone game development at Robert Gordon University

Gordon Eccleston has been teaching the development of apps for quite some time.  He gave a short talk on what works and what hasn't worked.  Gordon introduced a new term: the flipped classroom!  I hadn't heard this term before, but apparently this is where students do some preparatory work at home to prepare for tutorials (I think I've got that right!)

Gordon spoke about how things have changed.  These days students invariably have their own devices.  One difficulty is that vendors are always changing their devices, which means that lecturers can never fully control their own teaching environment.  This said, Gordon does have access to some iPod Touch devices, allowing code created using the Xcode platform (the environment used to create iOS applications) to be deployed to real devices.

Gordon also mentioned that the school had access to the Unity3D engine.  This gave rise to an interesting discussion about the difference between games programming and games design courses.  I've also made a note that when it comes to the submission of course work, submission to an app store represents one judgement on quality.  When it comes to further assessment by the lecturer, one approach is to ask students to create a screen cast.  Assessment, I seem to recall, is a perpetual challenge (especially with the continual changes in technology), as is how to provide both teaching and resources through a web based environment.

Mobile apps development: enhancing student employability

Sally Smith and Scott McGowan, both from Edinburgh Napier University, gave a short talk and presentation on the importance of employability skills.  Sally, who is the head of school, said that employers value relevant experience, want to see applicants who have a relevant degree, and look for good soft skills. 

Faced with the necessity to demonstrate employability skills, it was argued that it would be useful if students could create something (say, an app, or some other related project) that can be both added to a CV and talked about in an interview.  Sally also talked about the importance of industrial experience and how her institution and school tackled this issue.

Teaching and assessment strategies in mobile development

David Glass teaches mobile development to second year undergraduates at the University of Ulster.  Students create apps for the Android platform with Java using Eclipse.  Important parts of the module that I've noted down are subjects such as user interface design, data persistence and networking.  There is also a period of self-study where students gain an overview of mobile devices.

Challenges include the teaching of programming, and understanding what to assess and how.  The assessment approach that David mentioned sounds really interesting.  Students are required to address legal, ethical and social issues.  They are then required to develop a basic app before moving on to creating something that is more advanced.  A basic app might be something such as a simple calculator or a measurement converter.  

Interestingly, a more advanced app might be something like a 'my run tracker' app.  David made the important point that the task of creating apps lends itself to more open-ended assessment and group work.  Taking this approach has the potential to encourage creativity and help with motivation.

Design designers, don't program programmers

Lindsay Marshall, from the University of Newcastle, gave an impromptu talk that described his own ten credit postgraduate module and connected with many of the earlier debates.  At the end of his module, students are required to submit a portfolio.  Relating to the challenges of assessments, students were allowed to choose whatever platform they wanted, and choose whatever problem they wished to solve.  Students were encouraged to produce a design log and to present some kind of demonstration.  Moving forward this may take the form of a video presentation or recording.

Lindsay made the point that it is also important to take the time to look at the code, as well as the final product.  Another component is the writing of a reflective essay, to describe what was learnt during the project.  Interestingly, there are no lab sessions.  Instead, Lindsay mentioned the importance of crit sessions, a technique that is widely used in design.

What really struck me from Lindsay's presentation was something that was also pretty obvious: that there are significant connections between the design disciplines and software development.  Both are fundamentally creative subjects, and both require people to understand the inherent nature and characteristics of problems.

Web technologies

And finally, it was my turn.  During my slot I spoke about a new Open University module called Web Technologies (TT284, Open University website), emphasising the point that there are so many important technologies that underpin the use of mobile technologies and devices. 

TT284 is interesting in a number of different ways.  Firstly, it uses a set of case studies of increasing size.  Students move from understanding how to create an app for a small club or society, through to understanding what might happen as a part of a software development company.  Students are then introduced to 'software in the large' (or sites that have incredibly high volumes), and the practical issues that might need to be addressed.

When it comes to mobile technologies, drawing on a case study, students are asked to create an app for an Android device using MIT App Inventor (MIT website).  App Inventor is a graphical programming language, and code can be moved to real devices.  One of the challenges for any module that aims to teach mobile technologies is the way that technology changes so quickly.  A really good aspect of this particular module is that it also addresses a good number of fundamental and really important standards and technologies.

Reflections

I learnt quite a lot from the vendor presentations and it's always useful to hear about the industrial perspective, particularly in a field that is moving so phenomenally quickly.  Whilst it's great for academics to learn what industry is getting up to (and you might argue that this is a thoroughly essential part of the job description), the presence of vendors hints at an implicit battle for the hearts and minds of developers.  Users choose devices and technology that allow them to do cool stuff.  Cool stuff is created by developers.  Developers, in many cases, come from universities.  Taking this even further, developers are employed by industries which ultimately want people to be skilled in using particular software infrastructures and ecologies. 

Things have changed since I first started to go to these mobile technology events.  There are now many more devices than there were before.  The devices themselves have changed - they have more memory and power, and on the horizon there is a new generation of faster mobile networks.  By the same token, there are, of course, new tools, development environments, frameworks and libraries.  Educators are faced with the challenge of what to teach.  Some educators choose particular platforms, whereas others leave this decision entirely up to students.

When it comes to pedagogy, project and group work appear to be fundamentally important, particularly when it comes to developing employability skills and creating artefacts that can be presented to potential employers.  Keeping things open (in terms of either platforms or the problems that can be solved by the application of mobile technology) can present some challenges when it comes to assessment.  There seems to be some consensus that asking students to produce videos of their working apps might be a good approach.

Making a decision about which platform to use or to develop for isn't easy.  When I was a student I was once told by a faculty member that 'you really need to know how to use all types of technology'.  His point was that you will more readily be able to move between one platform and another.  In doing so, you'll gain a degree of flexibility that will allow you to appreciate how things might be done in different ways.  This is a perspective that has stuck with me, and one that remains important, since the platform that you're using now will become obsolete in a couple of years' time.

When it comes to mobile technology, everyone is trying to figure out what things we should be teaching and what the best approaches to teaching might be.  When we're dealing with an industry that is moving as quickly as it is, these kinds of events can be useful in terms of making connections and putting a marker in the ground whilst saying, 'this is how we do things today'. 

Christopher Douce

Making the history of computing relevant: Day 2

Visible to anyone in the world
Edited by Christopher Douce, Monday, 28 Oct 2013, 13:37

Session: Putting the history of computing into different contexts

The voice of the machine: Tom Lean

Tom Lean from the British Library kicked off the second day with a presentation about a project that he is currently working on: An Oral History of British Science (blog).  An important part of this project is about the history of computing.  A part of Tom’s role is to travel around the country to interview different people.  Each interview takes between 10 and 15 hours.  The interviews are biographical; people are encouraged to talk about themselves, their environment, tools and procedures.

This life-story approach to interviewing allows us to get a sense of the person themselves, their mannerisms and how they sound.  It allows us to have a more direct connection with the subject and those people who played a part in its development.  The longer interviews are edited down to highlights, which will then be made available through the British Library History of Science project website.  I understand that researchers will be able to gain access to the entire interviews.

Tom gave us a taste of the interviews by showing us a clip of Ray Bird talking about the HEC1 computer (YouTube).  (For the interested, there’s also an Oral History of British Science YouTube channel).  The second clip was an interview of Mary Lee Berners-Lee (Wikipedia) who spoke about ‘what’s fun about programming’.

All in all, a great talk and a great initiative.  As an aside, I remember discovering another archive of oral histories of computing (University of Minnesota), which has been collected by the Babbage Institute.  Different interviews by different people (and institutions) are likely to explore and expose different issues.  Both archives are invaluable to present and future researchers.

Telling the long and beautiful (hi)story of automation: Marie d’Udekem-Gevers

Marie took us on a tour of devices that relate to the history of computing, offering us a slightly different perspective.   Computing can also be understood in terms of mechanisms, mechanisation and automation, which eventually takes us towards data processing.  We can also think of the history of computing in terms of generations, but there is also an important pre-history that we need to be aware of too. 

When we think of the pre-history of computing we might consider mechanical and water clocks, and the development of the Jacquard Loom (Wikipedia).  There is also the work of Pascal (who was mentioned earlier) and Babbage (whose trial machines are exhibited within the Science Museum).  Marie introduced a simple distinction: internal versus external representations (and memory).

The difference between the two is that we can easily (and obviously) see external representations (of information), captured within cards, or as notches on a rotating wheel.  Modern computers, of course, make use of hidden internal representations.  The difference between internal and external connects to the notion of the immediately understandable and tangible versus the hidden and abstract nature of software.  This connects to a wider (and later) debate about what we can gain by exhibiting the more recent generation of computing devices.

Competing histories of the internet: Christopher Leslie

Christopher Leslie (PPY homepage) teaches the history of internet technology at the Polytechnic Institute of New York.  During his talk, Christopher mentioned a couple of books – one that I have read, and another that I had never heard of before.  The first is called ‘Where wizards stay up late’ by Hafner and Lyon.  The second was called ‘NERDS: a brief history of the internet’.  (There are, of course, a number of other books about the history of the internet, such as one called ‘A brief history of the future’ by a former Open University colleague). 

A couple of comments that Chris made echoed some from the previous day: that it is very easy to take a determinist view of the history of technology, in which developments occur gradually and in a number of predetermined steps.  When it comes to the history of the internet, there have been a number of different systems and innovations, emerging from different countries and locations.  One interesting note that I made was that development occurs through a series of transitions, as technology is moved from one context to another.

Chris mentioned the work of Donald Davies at the National Physical Laboratory, Teddington, and an important Association for Computing Machinery conference in 1967 where two people who had never met each other presented very similar ideas.  In fact, I’ve read that the word ‘packet’ (as in the phrase ‘internet packet’) comes from Davies’s work, whereas the protocols that make the internet work come from the work in the US (of course, I’m hopelessly simplifying a whole swathe of really important history and technical stuff here!)  Chris also mentioned the French network Cyclades (Wikipedia), which also influenced the development of ‘the internet’.

I’ve also made a note of his point that the connections between people and communities are really important.  Although defence funding was undoubtedly important, it is the connections between people and the culture of openness that exists within an academic community that help developments to occur.  Another really important point that I’ve noted is that we ‘need to fight determinism in the classroom!’  I totally agree. 

My ‘take away point’ from Christopher’s presentation was that things are a whole lot more complex than they first appear; there isn’t one history – there are many.

Session: Games

I was initially surprised to see a session on games at this conference, but the reasons why (and the importance of its inclusion) soon became apparent.  This session resonated a lot with me, since I was once an avid player of games during the ‘cassette era’!  There is also an increasing awareness that there is a whole history that relates to the use of computers in entertainment.

Games and gaming can also make compelling museum exhibits; they can potentially be used to draw people in to other exhibits.  This is why this session also had the subtitle ‘games – and its potential as a Trojan horse’. 

The popular memory archive: Helen Stuckey

Helen Stuckey, who travelled all the way from Australia, talked about a project that was all about collecting and exhibiting player culture from the 1980s.  I never knew this, but apparently there was quite a unique gaming culture in Australia and many games were developed locally due to import restrictions. 

The popular memory archive is a web portal.  Gaming isn’t just restricted to the games as artefacts; there is a wider and richer picture of use and consumption that is important too.  The portal allows visitors to save or record player memories.  In the 1980s games were often the first way that people came into contact with computers (this was certainly my own experience).  I have my own memories of walking to a newsagent and agonising over which game to buy with my own pocket money.  This walk, and the action of loading the game into my Atari computer in my cramped bedroom could be considered as a part of my biography.

Other aspects of computing history include the history of production and the role of hobbyists.  Helen showed us a logo of the ‘Melbourne House’ software company, which I certainly remember from my teenage years.  At the time, it had never occurred to me that this was an Australian company.

One of the challenges lies with choosing which artefacts and issues to focus on.  Out of a potential 900 titles, 50 game titles were chosen.  Some of the themes that I’ve noted include businesses, the rise of the bedroom coder, legal issues, and the role of the collector.

Fan sites, such as Hall of Light (a database of Commodore Amiga games) and World of Spectrum, also have an important role to play in terms of documenting history.  (I started to look into both of these sites, and quickly found hours of my life had disappeared!)

I found the idea of a web-based resource really interesting.  Just as we have citizen science projects, such as Galaxy Zoo, I can see that there is scope for participative, or citizen history sites.  When there are so many memories and products and experiences out there, crowdsourcing is undoubtedly a powerful approach.  I’m enthusiastic about old games, and after a quick search around on the web following Helen’s presentation, I can clearly see that I’m not alone.

Introduction of computer and video games in museums: Tiia Naskali

Tiia’s presentation was about a physical exhibition rather than a virtual one.  Tiia spoke about gaming from the Finnish perspective and the hobbyist era between 1980 and 1990.  (On reflection, this is an incredibly short period of time in which a whole lot happened). 

Connecting to some of the points that Helen made, Tiia noted that games are a part of life histories.  They are important within popular culture, and the work of that period can be shared and appreciated by newer generations.

What struck me as really interesting was Tiia’s summary of the different game exhibitions that have taken place across the world.  One of the most prominent was Game On, which apparently began at the Barbican, London. 

Gaming exhibitions will continue to have resonance today.  In the month of this conference, the latest generation of games consoles was receiving a lot of attention: the Xbox One (Wikipedia) and the Playstation 4 (Wikipedia).

This session led to questions about the challenges of digital preservation, i.e. whether we should be considering how to preserve digital worlds.  For those who are interested in this project, more information can be found by visiting the project website, which also contains a link to a final report.  Other points raised during the question and answer session related to the authenticity of the gaming experience and the potential societal impact of the use of games, which is, of course, the subject of on-going research.

Session: The importance and challenges of working installations

Computer Conservation Society – Its story and experience: Roger Johnson

Roger Johnson introduced the Computer Conservation Society (society website).  It wasn’t an organisation that I had heard of before, but I’m so glad that I heard about it.  The society was the brainchild of Doron Swade (Wikipedia), former curator at the Science Museum (who has written a cracking book about the trials and tribulations of building Babbage’s Difference Engine No. 2).

The society is a joint venture between the Science Museum and the British Computer Society and currently has approximately 800 members.  It has a number of guiding principles.  Firstly, membership is open to all, and it is free.  The society doesn’t own computers but has, instead, close links to museums.  It also has a small rescue fund, which can be used to help preserve historically significant machines that might be at risk of being disposed of. 

During Roger’s talk, I made a note of the phrase, ‘today is tomorrow’s history’.  Given that there is so much going on at the moment, a challenge lies in understanding what should be captured. 

For those who are interested, the CCS also has its own newsletter, called Resurrection (CCS website).

Museums – what they can and should be doing: Charles Lindsey

Peter Onion, who works on the Elliott 803 (Wikipedia) at the National Museum of Computing (and probably does a whole range of other things too!) temporarily stepped in for Charles Lindsey (who was, however, able to attend the question and answer session).

Peter, using Charles’s words, spoke about the objectives of a museum.  Two objectives are to inform the public and to help serious researchers.  Peter argued that perhaps there is a third: to preserve (and develop) the skills necessary for the maintenance and operation of the objects, and to preserve the perspective of those who created them.

One really interesting (and important) point is that museums are about history, not fashion.  One question was whether computing history ended in 1980.  This echoed an earlier point that some modern computers can appear to be visually uninteresting; their mystery and complexity is hidden within integrated circuits.  Working (historic) machines have the potential to add and expose depth, and may be able to more directly expose the details that make things work.  There is also the question of what stories we may tell, and questions about what issues earlier engineers (and maintainers) may have faced, the methods they used and the tools they applied.

History, nostalgia and software: David Holdsworth

We all know that hardware without software is useless.  A laptop without an operating system or application software becomes a pointless and immutable mix of plastic, glass and electronics.   Software is the stuff of computing (you might almost call software its ‘oxygen’), but so much of it is lost.  One of the most obvious reasons is that software is inherently invisible, and increasingly so.  This raises the important question of how to go about preserving (and also potentially exhibiting) software.

David showed us an interesting couple of web pages: an implementation of the Algol-60 programming language (Wikipedia) for a KDF9 computer (Wikipedia), demonstrated through a web page.  For those who know something about the history of programming languages, Algol is a really important language.  Think of it as the Latin of programming languages; it’s not used much these days, but you can see strong echoes of its design in programming languages of today, such as Java.  (Being more of a software guy than a hardware guy, I felt that more might have been said about the history of languages).

The fact that we can write programs using an old language through a web page is really cool.  Such an approach allows us to sample the past and get a feeling for how things used to work.  David argued (or so I have noted down) that we should ideally be able to browse and analyse source text, see software working and sample the user experience.  I agree with him.

When it comes to digital preservation, David made the point that we need to read the original media and save it to new media, to keep a byte stream, and to create software to manipulate and work with these byte streams.  Not only is the software important, but so is the documentation.  One way to deal with the documentation challenge is to scan existing manuals.  Documentation, however, can be flawed and incomplete.  The best representation of how a machine worked is an emulator.  A well written emulator becomes a description of how hardware operates.
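The first of those steps can be sketched in a few lines.  What follows is my own illustrative TypeScript (run under Node.js), with hypothetical file names, showing the basic idea: keep the raw byte stream, and keep a fingerprint against which future copies can be verified.

// Read an image of the original media as a raw byte stream, record a
// SHA-256 checksum so that later copies can be verified, and write the
// bytes out to new media.  The file names are hypothetical.
import { readFileSync, writeFileSync } from 'node:fs';
import { createHash } from 'node:crypto';

const original = readFileSync('tape-image.bin'); // bytes from the old media
const digest = createHash('sha256').update(original).digest('hex');

writeFileSync('tape-image.copy.bin', original);      // the copy on new media
writeFileSync('tape-image.copy.bin.sha256', digest); // the fingerprint

console.log(`Preserved ${original.length} bytes, SHA-256: ${digest}`);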

On the subject of emulators and software, I tried a thought experiment: what kind of exhibit would I create if I wanted to present something about the history of software?  Some random thoughts include: the presentation of a command-line interface (echoing the use of a teletype), followed by the use of DEC terminals.  This would then be followed by a hands-on emulation of a Xerox Alto, followed by another emulation of an Apple Lisa (perhaps even an actual machine).  This could then be followed by a really early version of Windows, before concluding with a touch screen tablet interface (running either iOS or Android).  All these presentations got me thinking!

The Teenage Baby: Chris Burton

I visited the Museum of Science and Industry (MOSI website) when I was looking around Manchester before choosing to study Computer Science there as an undergraduate.  Chris’s presentation has underlined that a repeat visit there is now long overdue.

The Manchester Small Scale Experimental Machine (SSEM), also known as the Manchester Baby (Wikipedia), was designed by Williams, Kilburn and Tootill and is considered to be the first stored program computer in the world.  Chris gave a description of a programme to reconstruct a replica of this very first machine.

The reconstruction was completed in 1998.  Chris told a fascinating story of the role the machine has played within the museum.  It was a story of movement and construction, of relocation and restarting.  The SSEM replica has now been in operation for fifteen years; it is worth remembering that the original machine ran for only three.

Chris emphasised the very important role of volunteers.  A volunteer can act as a guide, introducing the different aspects of the machine to visitors.  Chris told us a story of a volunteer who held aloft a Williams tube and said, ‘this is what a flash drive looks like in 1948... and it only holds a millionth of a gigabyte’, raising curiosity and grounding the past in the technology of the present.

Physical reconstructions not only embody history; they also represent and echo some of the processes that occurred as part of the development of a machine.  By recreating the past, we can not only develop skills, but also uncover challenges that the early designers and users faced.

Session: Reconstruction stories

Reconstruction of Konrad Zuse’s Z3: Horst Zuse

One of the truths in the history of computing is that a number of parallel developments were happening around the world at the same time.  In Britain there was the work at Bletchley Park, in the United States there was the work at the University of Pennsylvania, and in Germany there was the work of Konrad Zuse.

Horst Zuse, who made a presentation at this conference, is Konrad’s eldest son.  I have known about Zuse’s work for a long time, and had heard that his very early machines were destroyed in World War II.  What I didn’t know was the extent of Zuse’s creativity and innovation.  His early machines, the Z1, Z2 and Z3, used binary floating point numbers.  The Z3 can be considered to be one of the first functional programmable computers in the world.  One of the differences between the Z3 and other early machines is that it made use of electromechanical relays.  The Z3 apparently used two and a half thousand of them, with six hundred being used for the calculating unit.

In 2008 Horst proposed building a new version, or reconstruction, of the Z3.  The new machine could be used to teach the principles of computing (addressing the earlier point that the computing devices of today are more difficult to understand).  This reconstruction was to make use of modern telecommunication relays, but this doesn’t diminish the challenge of creating such a machine.

Horst talked about the delivery of the relays, the racks in which they were housed, the construction of the memory and some of the challenges regarding the input devices (if I remember correctly).  The reconstruction was initially located in the Technical Museum, Berlin, to accompany the Z1 reconstruction that took place between 1987 and 1989.  Its final destination is likely to be the Konrad-Zuse-Museum in Hünfeld (museum website).  The museum looks like a cool place to visit!

There were two surprises in store for me.  The first was that Zuse created a binary calculating engine whilst independently rediscovering some of the principles that had previously been discovered by George Boole.  The second came during the question and answer session, when a delegate asked about something called Plankalkül (Wikipedia).  I had never heard of this before.  In essence, Zuse proposed the design of a programming language decades before it became practically possible.

EDSAC Replica Project: David Hartley

Every ‘first’ is qualified.  Zuse’s machine is considered to be the first programmable computer, the Manchester Baby is considered to be the first stored program computer, whereas EDSAC (Wikipedia) is considered to be the first computer that went into regular service with the specific intention of solving problems for its users.  I didn’t know this, but EDSAC is also credited with having helped three Nobel Prize winners.

The EDSAC reconstruction (project website) started in 2010, following a conversation with a co-founder of ARM (which designs the processors that are used in smartphones and a whole host of other devices).  The project aims to have a working machine by 2015.  As well as creating a machine, corollary objectives include the desire to create a new archive of related materials and resources and, importantly, to create expertise.  These objectives connect nicely to the points that Peter Onion made when he was talking about the role of museums: that the very act of rebuilding (or preservation) actively enables past skills, tools and techniques to be rediscovered (and new approaches to be reapplied).

The machine is to be housed at the National Museum of Computing at Bletchley Park.  It’s interesting that there will be two early machines with very different memory technologies: the use of a cathode ray tube, and mercury delay lines.  I understand that there is a connection with the Dollis Hill research centre somewhere along the way, but I don’t fully understand the details just yet.  This just underlines the point that there’s always lots more reading to do.

For those who are interested, there’s a YouTube clip about the EDSAC replica project.

The Harwell Dekatron Computer: Kevin Murrell

The Dekatron computer, or WITCH (as it is affectionately known), strikes me as a bit of an oddball – but a very interesting one!  It was designed for (or as a part of) the UK Atomic Energy Research Establishment, Harwell, Oxfordshire.  Kevin told us that it is relay controlled, but it has an electronic arithmetic and logic unit (the bit that does all the calculations).  It also makes use of Dekatron valves, which serve as its memory.

After spending its early life at Harwell, it was moved to Wolverhampton and Staffordshire Technical College (which later became a university).  Because of this move and its role in education, it remains perhaps the oldest original working computer in the world.

More information about this interesting machine can be found through the following YouTube video: The reboot of the Harwell Dekatron/WITCH computer.  The Computer Conservation Society also has a page about the WITCH (CCS website).

Capturing, restoring and presenting IRIS: Ben Trethowan

IRIS is an abbreviation for Independent Radar Investigation System.  Its role was to collect radar signals to record the movements of aircraft.  Should there ever have been a mid-air collision, the data collected by IRIS could have been used to provide key evidence for any investigation.  IRIS was said to have been built in the 1970s and ran until 2008, when it was decommissioned – an astonishing length of time for a single system.

Ben gave us some information about the technology.  IRIS was based on a DEC PDP11 that had been heavily customised.  Apparently the operating system had been customised too.  When it comes to computer conservation, the march of time can have an impact.  One of the challenges that Ben faced concerned magnetic tapes.  Over time, oxidisation had occurred, which meant that the metal layer used to store the data was starting to separate from the plastic layer.  An important part of IRIS was the use of high capacity data cartridges.  These too had started to degrade: rubber parts used as a part of the tape drives (or the cartridges) were beginning to perish.

As far as I remember, the previous owners of IRIS contacted the computer history museum and asked if they would like it.  Ben then got involved with the project to move the machine to Bletchley Park, working very closely with the donor organisation.  In doing so, he gained a thorough understanding of the role of the machine and the context in which it was used.

What struck me about Ben’s presentation was that he presented what amounted to a ‘good practice’ guide for computer conservation.  Ben’s talk was very clear; it was very interesting to hear all about the ‘other stuff’ that technical curators or ‘machine keepers’ need to consider or take account of.  Whilst a machine is interesting in its own right, understanding the context of use and the sharing of hard won expertise are invaluable in terms of understanding how a machine works, its design, and its broader organisational and cultural significance.

I’ve made a note (during Ben’s talk) that a good relationship with a donor organisation is important.   It also struck me that good computer conservation isn’t just about dealing with the computer and its software.  A computer forms a part of relationships between groups of people.  As soon as a computer moves from its original context to a new one it can easily become disembodied.  Understanding the human structures as well as the technical structures strikes me as a dimension that museums always need to be mindful of.

Reflections

The conference ended with a short panel session.  I have to confess to being pretty mentally tired at the end of the two days, and I didn't take in as much at this point as I would have liked!  This said, the conference was just the right length; a third day would have been too much for me!

This part of the blog is a set of random reflections – nothing too controversial; just a set of thoughts on what struck me as the main themes.  I’m sure that different people would have come away with a different set of themes based on their own personal interests.

One of the key themes of the conference was (perhaps unsurprisingly) the role of museums in the history of computing.  There are some fundamental challenges regarding preservation when many aspects of computing (and computer use) are intangible.  There is also the question of which stories to present and how we might present them, and of how we make what is sometimes abstract visible, to try to make it understandable.  One approach, of course, is to use guides or interpreters to try to inspire visitors and help them to understand abstract ideas and principles.  Grounding the role of machines in terms of their application or their wider social context also strikes me as being very important.

The reconstruction of old computers featured heavily, and this was a surprise (but in retrospect, this was more due to my own unfamiliarity with what was happening in this sector than anything else).  Reconstruction is a process that both generates and reaffirms knowledge.  It also strikes me that it is a fabulous way to conduct research into some of the early designs and to share expertise.

Another theme relates to the role of history and its relevance.  A number of speakers said that the history of technology or computing isn’t taught a great deal.  Computer history certainly wasn’t taught on my undergraduate degree, and this is a shame.  I was also struck by the assertion that subjects such as computing are viewed as ‘ahistorical’.  This said, scratch the surface and there’s a whole host of rich, deep and fascinating stories. 

It was also a real delight to inadvertently discover that people who had a connection with the actual history of computing were able to come along to the conference.  What also struck me was a sense of community, especially amongst those who have an involvement with the Computer Conservation Society.

A final word on what I (personally) got out of the conference.  One of my research interests relates to how ‘place’ played a role in the development of computing, i.e. what happened and where.  I also hope to travel to the different places where these innovations have taken place.  This, for me, will be a catalyst for adventure and learning.  In fact, I’ve already taken a couple of journeys and hope to do many more in the coming years.

One thing that I’ve realised is that there is so much history on my doorstep.  During the conference I was chatting to a former colleague who I was amazed to discover had a direct and immediate connection with a computer called LEO (Wikipedia), which was arguably the world’s first commercial computer.  (There was the UNIVAC in America, but I would have to travel quite a way to visit the places where it was created).  I know hardly anything about the LEO.  I feel that a whole new journey of discovery is just about to begin.

Christopher Douce

Making the history of computing relevant: Day 1

Visible to anyone in the world

Ever since I was a kid I've been interested in the history of computers.  When I was aged ten or eleven I would try to buy an issue of a pretty serious hobbyist magazine using my pocket money every two weeks.  Each issue was part of a series that, when complete, would make up two really heavy books.  (I couldn't afford to buy very many of the issues, of course... I didn't have enough pocket money!)

In these magazines I remember seeing old black and white pictures of a machine called ENIAC, and reading about very early computers such as the Manchester Baby and the work of Zuse in Berlin, Germany.  These old pictures and articles have always stuck in my mind.  The past, to me, was interesting.  It was, in some way, another world that was there to be explored.

This is the first of a series of two blog posts about a conference I recently attended at the Science Museum, London, on the subject of the history of computing, held between 17 and 18 June 2013.  More information about this conference is available through the conference website, where you can find copies of the papers and presentations.  Google have also posted a page about the conference on their Google Europe blog.

My attendance at the conference occurred as a result of a random chat with one of the organisers about an old computer company called Elliott which once had its headquarters not too far from where I live.  This sounds like a random conversation - and it certainly was!  But I'm very glad it happened.

What I hope to do with these blog posts is to (briefly) summarise each of the presentations (this is something that I do for myself from time to time, to help me to remember what happened).  One disclaimer is that I'll be picking up on the things that I personally found of interest, and I obviously can't do justice to every excellent presentation.

This said, I do hope to provide some links to some of the resources that some of the speakers mentioned, which I hope will be useful to fellow delegates, researchers and students alike.  A final disclaimer is that I'm only going to mention the names of the presenters who gave each talk (even though there were many other contributors) and that there's also a strong possibility that I may well inadvertently misrepresent or misunderstand things.  If I have done this (and you find this blog), then please do correct me by making a comment below.

Opening

The event was opened by Tilly Blyth, Keeper of Technologies and Engineering at the Science Museum, Arthur Tatnall, chair of the IFIP (IFIP website) WG 9.7 History of Computing group, and Lynette Webb from Google.  Tilly spoke about some of the objectives that relate both to the conference and to the Science Museum.  These include the need to understand the audience and attract their attention, the use of compelling and engaging stories, and the importance of objects that can inspire awe and wonder.

Session: The importance of storytelling in museums

Exhibiting the on-line world: Marc Weber

The first formal presentation of the day was by Marc Weber, who did a great job.  One point that I've made a note of is that it is very easy to overlook the fact that technology has a rich and detailed history.  There is always a back story.

Marc introduced us all to the idea of a hierarchy of exhibitability.  I immediately grasped what he meant: some items (or ideas) can be immediately understood and appreciated, whereas others can be difficult to present and grasp.  Exhibits can range from the personal and visual to those that aim to present abstract ideas.  A lot of computing can be, by its nature, pretty abstract.  One way to get over this is to present concepts and ideas using computer screens - but could we do better than presenting information on large glowing rectangles?  How could we exhibit networking, for example?

One approach is to display physical artefacts, such as an original Interface Message Processor (Wikipedia) alongside current devices such as Cisco routers.  The challenge of exposing and exhibiting the internet to visitors 'is like trying to display the wind'.  The question of creating an exhibit about the internet reminds me of how everything (in terms of ideas, as well as devices) is connected.  To understand the history of computing we also need to understand the history of other aspects of technology, such as the history of telecommunications.

Narrative in the History of Computing: Tilly Blyth

I can remember the first time I visited the Science Museum computing gallery.  There was an actor who played the role of Charles Babbage.  The actor walked up to me and started to talk enthusiastically about his work.  Since I was then a shy twelve year old, I was having none of it - I just wanted to look at the exhibits; I was mildly traumatised by the actor's enthusiasm and he left demoralised.  Not quite an indelible scar, but an interesting memory that reflects one really interesting approach that museums can take to make their collections come alive.

Tilly spoke (amongst lots of other things) about different approaches to exhibitions.  One of the problems with the chronological approach, presenting a gradual (and natural) progression from the past to the present, is that it suggests a degree of inevitability, or technological determinism.  A challenge with this approach is that it doesn't take into account the wider social issues and circumstances that brought about technological innovation and development.  Another point is that innovation happens in fits and starts, and there are many dead ends.  It's also the case that people remember stories, which is one reason why the stories of people are so important.

Tilly also spoke about the current exhibition about Alan Turing that celebrates his contributions and life, whilst also exhibiting a number of related artefacts.  This story telling or biographical approach strikes me as one that is understandable and compelling.

I didn't know about this, but there is going to be a new Information Age gallery.  (You can learn more about this through Tilly's blog.)  The gallery will expose, examine and celebrate subjects through the eyes of those that were affected.  It will cover key communication technologies such as cable, broadcast, satellite, web and cell (radio) technology.  According to my roughly scribbled notes, it will feature something about the first communications cable that went across the Atlantic, and will feature oral histories and video presentations.

At the centre of the exhibition will be something called the Rugby tuning coil, which was once used for the transmission of very low frequency signals to submarines.  Such an object can connect to important subjects such as information theory and transmission.  After seeing a photograph of the coil I can assert that it is a striking and arresting object.  It appears to be one of those artefacts that is beautiful not only in its physical construction, but also in the sense that its design embodies the principles of the technology that it utilises.

I've made a note that Tilly mentioned that there will be a series of stories.  There will be stories about the first information machines, such as Tommy Flowers and his role in developing the Colossus, and the development of the Lyons Electronic Office (LEO), which is considered to be the first commercial computer in the world.  I understand that there will be something about the birth of computer networks.  A third story relates to the global information space, and a fourth is about computers for users (and being a tutor on a human-computer interaction module, this is a subject close to my heart).

Tilly's talk emphasised that narratives can connect places, ideas and artefacts, through people.  When it comes to exhibitions and artefacts, a key objective is to create resonance and wonder.  I, for one, am looking forward to visiting the new gallery when it opens.

Making history relevant through education and experience: Arthur Tatnall

I seem to remember that Arthur began with some questions: 'why should we be interested?  What questions come to mind when we see an old mainframe?  What can we do to make artefacts relevant and important?  What difference did it make to people's lives at the time?'  These are all great questions.

Linking back to an earlier presentation, there are (of course) a number of different streams that are important, such as mathematics, technologies for automation and control, technologies for information processing, and communication technologies.  Interestingly, Arthur mentioned something called Actor-network theory (Wikipedia).  This was a theory that I hadn't heard of before, and having an interest in the social sciences, it is something that I'll certainly be taking the time to look at.  In essence, the theory seems to be about the interaction between people and things.

Arthur also introduced some really important issues, such as: how do we preserve software?  (This is a question which crops up a number of times throughout this conference.)  There is, of course, the question of how we might convey the importance and relevance of software to visitors.  One approach might be to make use of guides to make the exhibits come alive (as long as they don't scare away any of the visitors, of course!)

Session: Key collections and the future plans

Heinz Nixdorf MuseumsForum: Jochen Viehoff

I never knew this, but apparently the Heinz Nixdorf computer museum is one of the largest of its kind in the world.  We were told that the museum has a total of one and a half thousand objects.  These range from very early mechanical calculating machines, such as those designed by Pascal and Leibniz, and also include objects that relate to the early history of telecommunications and telegraphy, such as an early machine by Samuel Morse.

Exhibits include a reconstruction of a Hollerith machine (Wikipedia) (which is an important part of the story of the IBM computer company) and different mechanical constructions and representations of the theoretically important Turing machine (Wikipedia).

By the end of the presentation I felt that this was one museum that I would certainly like to visit.  The challenge (as emphasised by Jochen) is that it might be quite difficult to get to: we were told that the town of Paderborn, where the museum is situated, is not the easiest place to reach.  (I was later told that he was exaggerating!)

Computers' Collection at the Polytechnic Museum: Marina Smolevitskaya

I never knew that there were so many museums collecting computing related artefacts!  During one of the breaks, I found out that there is a completely new computer museum opening in Cambridge (I look forward to learning more).  Marina, however, briefly talked about her work at the Polytechnical Museum (Wikipedia), Moscow, Russia.  The computing collection was founded in the 1960s and now consists of 800 objects and 2000 documents.

Session: Expanding the audience for computing history

The Case of Computing: Gauthier van den Hove

Students who learn mathematics and computing don't (it was stated) tend to learn much history.  This said, there are some exceptions - there are courses in the history of mathematics, and there are some lecturers (some of whom came to this conference) who teach the history of computing.

Gauthier drew our attention to the differences between historical disciplines, such as the humanities (where history plays an important and central role), and ahistorical disciplines, which could be considered the more technical subjects.  I'm not so sure whether things are as clear cut as this, but I understand the point that is being made.  I've also noted that Gauthier said one of the dangers is anachronism.  For example, it is very easy to view the past through the spectacles of the present; we can very readily take for granted what we know.  (This connects to the earlier points about technological determinism, and the observation that it is difficult to see the rich histories underpinning the technologies that we use on a day to day basis.)

There are two really nice quotes that I've made a note of: 'one of the main tasks of a historian is to identify the main facts to help us to remember the past', and 'the past is a source of inspiration for the present'.  Another thought regarding the role of historians is that they identify stories too, and that everyone is situated within a unique historical context.  When we consider the past, we need to consider the present too (and the relationship that we have with it).

The Mundaneum: Delphine Jenart

Delphine Jenart introduced something that I had never heard of before: the Mundaneum (Wikipedia).  In some ways, the Mundaneum, which is strongly connected to the subject of documentation science, can be associated with more recent ideas, such as Vannevar Bush's famous article As we may think (Wikipedia).

The take away point from Delphine's presentation was the importance of press coverage and exposure, which connects with the thought that there are many different ways to connect with a wider audience and emphasise relevance.  More information can be uncovered by visiting the Mundaneum website.

Resurrecting Ukraine's computing heritage: Lynette Webb and Marina Tarasova

I was about half way through my doctoral research in the late 1990s when I stumbled across a paper in the Communications of the ACM (perhaps the most prestigious computing journal there is) that had absolutely nothing at all to do with my research.  It was a paper that really grabbed my attention: it was all about the design and development of computers in the Soviet era.

One of the challenges that I faced as a research student was that there were so many different things that I found interesting.  I spent a day or so reading and re-reading the paper before deciding that I had better put this to one side and get on with my main research before I got carried away - but this reminded me of my long-running interest in the old and the historical.  The paper presented a perspective and a social history that was very different to the one that I had read about in the computer magazines that I used to buy as a school kid.  I remembered all these things during Lynette and Marina's presentation.

Lynette talked about the connection with Google, and how this led to interviews and newspaper articles.  Some important points (in terms of exposing a computing-related subject to the media) included the use of stories, anecdotes, anniversaries, photos and videos - all help to create a compelling and interesting picture.  Also, for those who are interested, there's a website entitled History of Computing in Ukraine.  It's pretty interactive and contains some cracking pictures.

Session: Spotlight on research projects

The Konrad Zuse internet archive project: Christian Burchard

Christian Burchard introduced the Konrad Zuse internet archive project.  Not only did Christian talk about the archive (and how researchers might use it to explore and study documents), but he also told us about a number of other exhibits and resources.  He also mentioned the reconstruction of the Z1 machine and associated on-line resources, such as a way to view the different components of the machine, and a demonstration of how it works.

As an aside, I understand that the Science Museum is hoping to make their archive of Babbage documents available to anyone who might be interested.

The Monads project: Chris Avram

Innovation and developments in early computing occurred at many different places at the same time.  Universities played a significant role in shaping and developing early digital hardware and software.  It is, perhaps, little surprise that universities have become unexpected custodians of machines of the past.

Chris Avram spoke of the preservation of computing at Monash University, Australia, and treated us to a number of interesting anecdotes regarding the use of punched cards and paper clips.  He also introduced us to the Monads computer, which was developed in collaboration with partners in Germany.  This went some way to reminding me that each institution has its own technical history which needs to be cared for.

Session: Integrating history with computer science education

Using old computers for teaching computer science: Giovanni Cignoni

There is a very compelling argument that some old things are simpler and are therefore easier to understand.  Old computers and technology open up a range of different opportunities when it comes to teaching.  Instead of being impossibly miniaturised, the circuits that do essential things are exposed, allowing ideas and principles to be more readily understood.

Giovanni told us about early Italian computers.  Just as each university has its own history, there is also a wider history that connects with and relates to individual countries (and groups of countries).  Another aspect to computing education is that simulations of early systems can expose the detail of how they could be operated.  Giovanni told us about the HMR project (pdf copy of presentation).  A simulator can be used to emphasise the difficulties of operating these machines, while also making their fundamentals and inherent complexity more tangible.

Is there a future in the Past: Chris Monk

Chris is the learning co-ordinator at the National Museum of Computing at Bletchley Park, which isn't too far from the Open University campus.  Visitors from schools are very welcome at the museum.  Not only can visitors be fascinated by the various galleries and exhibits, but Chris also runs 'learning to program' or coding sessions on a cluster of BBC Model B (Wikipedia) computers.  I visited this learning space a couple of years ago, and it reminded me of a couple of classrooms in my old school.

Chris commented that some learners can become very enthusiastic about the programming activities and even go as far as asking where they might be able to buy one of these old computers.  In such cases, students are directed to more modern resources, such as emulators.  A quick internet search (I couldn't resist...) reveals a wealth of resources.

The museum has seen an increase in visitor numbers in recent years.  An interesting point to note is that there is an apparent (and significant) gender imbalance, with boys outnumbering girls by a ratio of 30:1.  During Chris's talk I also made a note of a site (or a project) called Young Rewired State that aims to inspire the next generation of coders and developers.

In some respects, old machines or devices reflect the times in which they were built and used.  Chris asked an interesting question: 'will the word computer still exist in ten years?', at a time when devices are disappearing into our clothes and into our environment.

Apparently, computing pioneer Grace Hopper once said, 'computing without a past is just a subject, not a science'.  A thought (or point) emerging from this session is that it is incredibly easy to get thoroughly absorbed into the here and now.

Bringing relevance to computing courses through history: John Impagliazzo

I've made the following notes during John's talk: history broadens outlook, it helps us to look beyond the machine and it can help us to think critically.  History helps to make the discipline mature, yet it's only done on the fringe.  In which faculty should a historian of computing or technology sit?  Should they sit within the history department or the computing department?

John also mentioned the importance of corporate history.  Whilst a lot of the very early developments took place within universities (or organisations that are closely connected to universities in one way or another), more recent developments have obviously and undeniably taken place in the industrial sector.  An example of this might be the history of the Control Data Corporation (Wikipedia).  (As a brief aside, John also mentioned the Charles Babbage Institute, which is a centre for the history of information technology at the University of Minnesota).

I've also made a note of the following question: 'are teachers of technology conversant with the history of the technology that they teach?'  His point is that we're much more able to remember a story than a logical argument (or a bunch of abstract ideas).  Knowing a bit of history is good for teachers, which means that it's good for our students too.

Adapting, rather than re-inventing the wheel: Martha Crosby

The final presentation of the day was by Martha Crosby, who had travelled to the conference from the University of Hawaii, a university that has its own unique place in the history of computing and digital communications.  If you're interested in this aspect of computing history, the detail about ALOHANet (Wikipedia) is pretty interesting - it was something that kept me occupied as an undergrad.

Martha took us on a very quick tour of various milestones, whilst making the point that history adds to your toolbox.  She touched on the history of IBM, the development of the Harvard Mark 1, the ENIAC computer, the work by Zuse, and the Altair (one of the first personal computers).  Interestingly, Martha also touched upon the subject of programming languages, which has its own history that hasn't been discussed as much.

I've taken a note of a great quotation, which goes: 'the history of computing is the history of human kind's creativity and ingenuity which is why we should hold onto it forever', which I believe might have been attributed to Jason Scott (Blog).  (Searching for the source of this quote led me to a very interesting software archive (Archive.org).)

A final point is that ideas in computing are very often adaptations of ideas that already exist.  Understanding the trajectory of their development and combination is one way to understand the present.

Evening event: Alan Turing's Life and Legacy

By the end of the first day, my head was beginning to ache, big time.  It was a full-on day, which took everyone to the pre-history of computing and back.  We even (briefly) went back as far as 100 BC, before returning (close) to the present day to the origins of the personal computer and the internet.

After an hour's break, we found ourselves exploring a gallery in the Science Museum about the life of Alan Turing.  There were exhibits that I had never seen before, such as the ACE Computer (Wikipedia).

 

Christopher Douce

South East of England Associate lecturer conference: Kent College

Visible to anyone in the world
Edited by Christopher Douce, Monday, 24 Mar 2014, 14:14

Twice a year Open University associate lecturers have an opportunity to attend regional development events.  These conferences offer tutors a number of different training sessions about a range of different topics, ranging from changes in university policy through to the best ways to use technology.

Each event is different and has a slightly different character.  This blog is a really simple overview of an event that I recently attended at Kent College.  In fact, I think I remember visiting Kent College to attend my first ever tutorial, which was run by my then mentor, not long after starting as an associate lecturer.  I remember getting quite lost amongst a number of different buildings and being in quite a gloomy room.  Things have changed: Kent College was unrecognisable.  Old buildings had been demolished to make way for new modern ones.  This, however, wasn't the only surprise.

Teaching through drama

Not long after arriving, we were all gently ushered into a large theatre.  We could see a number of tables set out at the front and I immediately expected to endure a series of formal presentations about changes to the structure of the university, or an update about student registrations, for example.   Thankfully, I was disappointed. 

From stage left and right, actors suddenly appeared and started to scream and shout.  It immediately became apparent that we were all in the middle of a theatre production which was all about teaching and learning.  We all watched a short twenty-minute play of a tutorial, in which we were presented with some fundamentally challenging situations.  The tutorial, needless to say, was a disaster.  Things didn't go at all well, and everyone seemed to be very unhappy.  Our hapless tutor was left in tears!

When the play had finished and we were collectively shocked by the trauma of it all, we were told that it would be restarted.  We were then told that we should 'jump in' and intervene to help correct the pedagogic disaster that we were all confronted with.  Every five or so minutes, colleagues put up their hands to indicate that they would like to take control of the wayward situation.  It was astonishing to watch for two different reasons: firstly, the willingness with which people took on the situation, and secondly, the extensive discussions that emerged from each of the interventions.

Towards the end of the modified (and much more measured) play, I could resist no longer.  I too put up my hand to take on the role of the hapless tutor 'Rosie'.  My role, in that instant, was about communicating the details surrounding an important part of university policy and ensuring that the student (played by an actor) had sufficient information to make a decision about what to do.  It was an experience that felt strangely empowering, and the debates that emerged from the intervention were very useful; you could backtrack and run through a tricky situation time and time again.  The extensive audience, sitting just a few metres away, were there to offer friendly suggestions.

If an outsider peered around the door and saw what was going on, it might be tempting to view all this activity as some form of strange self-reflective light entertainment.  My own view is very different: there is a big distance between talking about educational practice in the third person, i.e. discussing between ourselves what we might do, and actually going ahead and doing the things that could immediately make a difference.  A really nice aspect of the play was that the students (as played by actors) were all very different.  I'm personally very happy that I'm not tutoring on the fictional module 'comparative studies'!  This first session of the AL development conference was entertaining, enjoyable, difficult and insightful all at the same time.

Sessions

After the theatre production, we (meaning: conference delegates) went to various parallel sessions.  I had opted for a session that was partly about the students and partly about gaining more familiarity with the various information systems that tutors have access to (through a page called TutorHome).  I've heard it said time and again that the only constant in technology is change.  Since the OU makes extensive use of technology, the on-line portal that tutors use on a day to day basis is occasionally updated.  A face to face training session is an opportunity to get to know parts of our on-line world that we might not have otherwise discovered, and to chat with other tutors to understand more about the challenges that each of us face.

The second session that I attended was also very different.  Three research students from the University of Surrey presented some of their research on the subject of motivation in higher education.  There is, of course, quite a difference between the face to face study context and the Open University study context.  A presentation on methods and conclusions gave way to an extended (and quite useful) discussion on the notion of motivation.

One memory of this session is the question of how students might move from being strategic learners (completing assignments just to gain credit for a module or degree) to having a motivation that is connected with a deep fascination and enthusiasm for a subject.  There are a number of factors at play: the importance of materials, the way in which support is given and the role that a tutor can play in terms of inspiring learners.

I made a note about the importance of feedback (in response to assessments that had been completed and returned).  A really important point was that negative feedback can be difficult to act upon, especially if there is no guidance about what could be done to improve.  (This whole subject of feedback represents the tip of a much larger discussion, which I'm not going to write about in this blog).

In terms of inspiration, one useful tip that I took away from this final session was that the relevance and importance of a module can be emphasised if the module can be connected to debates, stories and discussions that can be found in the media.  Although this is something that is really simple (and obvious), it sometimes takes conferences such as these to remind us of the really important and useful things that we can do.

Final points

All in all, a fun day!  From my own personal perspective, I enjoyed all the sessions but I found the theatre session particularly thought provoking - not just in terms of the points that were covered, but also in terms of the approach that was used.

Since I have no idea who is going to be reading this particular blog post (not to mention all the others I've written!), I guess I'm primarily writing for other OU tutors who might accidentally discover these words.  If you are a tutor, my overriding message would be: 'do go along to your regional conferences if you can make it - they are really good fun!'

If you're a student with the university I guess my message is that there are many of us working behind the scenes.  We're always trying to do the best that we can to make sure that you're given the best possible learning experience.  Another point that I must emphasise is that the instances of interaction with tutors are really important and precious (for student and tutor alike).  So, if you're a student, my message is: 'do go along to any face to face tutorials or day schools that might be available as a part of your module - there is always going to be something that you'll be able to take away'.

Christopher Douce

Animal Computer Interaction: Seminar

Visible to anyone in the world
Edited by Christopher Douce, Sunday, 4 Nov 2018, 11:09

As a part of my job I regularly visit the Open University campus in Milton Keynes.  On 5 June, I managed to find some time to attend a seminar by my colleague Clara Mancini.  Over the last couple of years, I had heard that Clara had been doing some research into the subject of Animal-Computer Interaction but we had never really had the opportunity to chat about her work.  Her seminar was the perfect opportunity to learn more about the various ideas and projects she was working on.

After a short introduction, Clara mentioned a number of topics from human-computer interaction (or 'interaction design').  These included topics such as the use of ambient technology.  This could include the use of smart sensors that can be embedded into the fabric of buildings, for example, so their environmental conditions and properties can dynamically change. Other topics include the use of augmented reality.  This is where additional information is presented on top of a 'real' scene.  You might say that Google Glass is one product that can make good use of augmented reality.

Clara also spoke of the interaction design process (or cycle), where there is a loop of requirements gathering, designing and prototyping, followed by evaluation.  A key part of the process is that users are always involved.  ACI is very similar to HCI.  The biggest difference is the users.

History and context

It goes without saying that technology is being used and continues to be used to understand our natural world.  One area which is particularly interesting is that of conservation research, i.e. understanding how animals behave in their natural environment.  One approach to develop an understanding is to 'tag' animals with tracking devices.  This, of course, raises some fundamental challenges.  If a device is too obtrusive, it might disrupt how an animal interacts within its natural environment.

Another example of the application of technology is the use of computer-driven lexigraphic applications (or tools) with great apes.  The aim of such research is to understand the ways that primates may understand language.  In conducting such research, we might then be able to gain an insight into how our own language has evolved or developed.

Products and systems could be designed that could potentially increase the quality of life for an animal.  Clara mentioned the development of automated milking machines.  Rather than herding cows to a single milking facility at a particular time, cows might instead go to robotic milking machines at times when it suits them.  An interesting effect of this is that such developments have the potential to upset the complex social hierarchies of herds.  Technology has consequences.

One important aspect of HCI or interaction design is the notion of user experience.  Usability is about whether a product allows users to achieve their fundamental goals.  User experience, on the other hand, is about how people feel about a product or a design.  A number of different user experience goals have emerged from HCI, such as whether a design is considered to be emotionally fulfilling or satisfying.  Interaction designers are able to directly ask users their opinions about a particular design.  When it comes to designing systems and devices for animals, asking opinions isn't an option.  Clara also made the point that in some cases, it's difficult even for us humans to give an opinion.  In some senses, by considering ACI, we force ourselves to take a careful look at our own view of interaction design.

Aims of ACI

Clara presented three objectives of ACI.   Firstly, ACI is about understanding the interaction and the relationship between animals and technology.  The second is that ACI is about designing computer technology to give animals a better life, to support them in their tasks and to facilitate or foster intra and inter species relationships.  The third is to inform development of a user-centred approach that can be used to best design technology intended for animals. 

Clara made the very clear point that ACI is not about conducting experiments with animals.  One important aspect of HCI is that researchers need to clearly consider the issues of ethics.  Participants in HCI research are required to give informed consent.  When it comes to ACI, gaining consent is not possible.  Instead, there is an understanding that the interests of participants should take precedence over the interests of science and society.

Projects

Clara described a system called Retriva (company website), where dogs can be tagged with collars which have a GPS tracking device.  Essentially, such a product offers a solution to the simple wish: 'if only I could see where my dog is using my iPhone'.  Interestingly, such a device has the potential to change the relational dynamics between dog owner and dog.  Clara gave an example where an owner might continually call the name of the dog whilst out walking.  The dog would then use the voice to locate where the owner was.  If a tracker device is used, an owner might be less tempted to call out (since he or she can see where the dog is on their tracking app).  Instead of the dog looking for the owner, the owner looks for the dog (since the dog is less reliant on hearing the owner's voice).

Dogs are, of course, used in extreme situations, such as searching for survivors following a natural disaster.  Technology might be used to monitor the vital signs of a dog that enters potentially dangerous areas.  Different parameters might be able to give handlers an indication of how stressed the dog might be.

As well as humanitarian uses, dogs can be used in medicine as 'medical detection dogs'.  I understand that some dogs can be trained to detect the presence of certain types of cancers.  From Clara's presentation I understand that the fundamental challenges include training dogs and attempting to understand the responses of dogs after samples have been given to them (since there is a risk of humans not understanding what the dog is communicating when their behavioural response to a sample is not as expected).

One interesting theme was the possible ways in which technology might be used to improve welfare.  One project, funded by the Dogs Trust, will investigate the use of ambient computing and interactive design to improve the welfare of kennelled dogs.  Some ideas might include ways in which the animals might be able to control aspects of their own environment.  A more contented dog may lead to a more positive rehoming outcome.

Final points

Clara presented a question, which is: 'why should we care about all this stuff?'  Studying ACI has the potential to act as a mirror to our own HCI challenges.  It allows us to think outside of the human box and potentially consider different ways of thinking about (and solving) problems.

A second reason connects back to an earlier example and relates to questions of sustainability.  Food production has significant costs in terms of energy, pollution and welfare.  By considering and applying technology, there is an opportunity to potentially reconceptualise and rethink aspects of agricultural systems.  A further reason relates to understanding how to go about making environments more accessible for people who share their lives with companion animals, i.e. dogs that may offer help with some everyday activities.

What I liked about Clara's seminar was its breadth and pace.  She delved into some recent history, connected it with contemporary interaction design practice and then broadened the subject outwards to areas of increasing prominence (welfare) and importance (sustainability).  There was a good mix of the practical (the challenges of creating devices that will not substantially affect how an animal interacts within their environment) and the philosophical.  The most important 'take away' point for me was that there is a potential to learn more by looking at things in a slightly different way.

It was also interesting to learn about collaborations with people working in different universities and disciplines.  This, to me, underlined that the boundaries of what is considered to be 'computing' are continually changing as we understand the different ways in which technology can be used.

Acknowledgements: Many thanks to Clara for commenting on an earlier draft of this blog.  More information about Clara's work on Animal-Computer Interaction can be seen by viewing an Open University video clip (YouTube).

Christopher Douce

BCS Lecture: The Power of Abstraction

Visible to anyone in the world
Edited by Christopher Douce, Friday, 10 Aug 2018, 14:41

When I was a graduate student at the University of Manchester (or the bit of it that was once known as UMIST) I was once asked to show some potential computer science students around the campus.  At the end of the tour I ushered them to a lecture which was intended to give the students a feel for what things would be like if they came to the university.

The lecture, given by one of the faculty, was all about the notion of abstraction.  We were told that this was a fundamental concept in computing.  In some respects, it felt less like a lecture about computing and more like a lecture about philosophy.  I had never been to a lecture quite like it and it was one that really stuck in my mind.  When I left the lecture, I thought, 'why didn't I have this kind of lecture when I was an undergraduate?'  As an undergrad I had spent many an hour creating various kinds of computer programs without really being told that there was an essential and fundamental idea that underpinned what I was doing.

When I saw the British Computer Society (BCS) advertising a lecture that was about the 'power of abstraction', I knew that I had to try to make time to come along. The lecture, by Professor Barbara Liskov, was an annual BCS lecture (the Karen Spärck Jones lecture) that honours women in computing research.

All this sounds great, right?  But what, fundamentally, is abstraction?  An 'abstract' at the top of a formal research paper says, in essence, what it contains.  Abstraction, therefore, can be thought of as a process of creating a representation of something, and that something might well be a problem of some kind.  Admittedly, this sounds both confusing and vague...

Barbara began her lecture by stating that abstraction is the basis of how we implement computer software.  The real world is, fundamentally, a messy place.  Since computers are ultimately mathematical machines, we need a way to represent problems (using numbers) so that a computer can work with them.  As a part of her lecture, Barbara said that she was going to talk through some developments in the way that people (or computer programmers) could create and work with abstractions.  I was intrigued; this talk wasn't just about a history of programming languages, it was also a history of thought.

So, what history was covered?  We were immediately taken back to the 1970s.  This was a period in computing history when the term 'software crisis' gained currency.  One of the reasons was that it was becoming increasingly apparent that creating complex software systems was a fundamentally difficult thing to do.  It was also apparent that projects were started, became excruciatingly late and were then abandoned, costing astronomical amounts of money.  (It might be argued that this still happens today, but that's a whole other debate which goes beyond this pretty short blog post).

One of the reasons why software is so fundamentally hard to create is that it is 'mind stuff'.  Software isn't like a physical artefact or product that we can see. The relationships between components can easily become incredibly complicated which can, in turn, make things unfeasibly difficult.  Humans, after all, have limited brain capacity to deal with complexity (so, it's important that we create tools and techniques that help us to manage this).

We were introduced to a number of important papers.  The first was by Dijkstra, who wrote a letter to the Communications of the ACM entitled 'Goto considered harmful'.  'Goto' is an instruction that can help to create very complicated (and unfathomable) software very quickly.  Barbara described the difficulty very clearly.  One of the reasons why software is so hard is that there is a fundamental disconnect between how the program text might be read by programmers and how it might be processed or executed by a machine.  If we can create a program representation that tries to bridge the difference between the static (what the text says should happen) and the dynamic (what actually happens when the software runs), then things would be a whole lot easier.
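To make that static/dynamic disconnect a little more concrete, here is a tiny sketch of my own (not from the lecture).  Python has no goto, so the jump-based style is simulated with a label variable - which, if anything, makes Dijkstra's point all the more vividly:

```python
# Goto-style control flow, simulated with a label variable: to follow
# the execution you must trace the jumps, so the program text no longer
# reads in the order in which things actually happen.
def total_goto_style(items):
    total, i, label = 0, 0, "loop"
    while True:
        if label == "loop":
            if i >= len(items):
                label = "done"
            else:
                total += items[i]
                i += 1
        elif label == "done":
            return total

# The structured version: the static text mirrors the dynamic behaviour.
def total_structured(items):
    total = 0
    for item in items:
        total += item
    return total

print(total_goto_style([1, 2, 3]))   # 6
print(total_structured([1, 2, 3]))   # 6
```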

Another paper that was mentioned was Wirth's 'program development by stepwise refinement'. Wirth is famous for the design of two closely related languages: Pascal and Modula-2. It certainly is the case that it's possible to write software without the 'goto' instruction, but Barbara made the interesting point that it's also possible to write good, well-structured software in bad languages (providing that you're disciplined enough). The challenge is that we're always thinking about trade-offs (in terms of program performance and code economy), so we can easily be lured into doing clever things in incomprehensible ways.

Barbara spoke about the importance of modules whilst mentioning a paper by Parnas entitled, 'information distribution aspects of design methodology'. One of the great things about modules, other than that they can be used to group bits of code together, is that they enable the separation of the implementation and the interface.   This has reminded me of some stuff from my undergrad days and time spent in industry: modules are connected to the term 'cohesion'.  Cohesion is, simply, the idea that something should do only one thing.  A function that has one name and does two or more things (that are not suggested in its name) is a recipe for confusion and disaster.  But I fear I'm beginning to digress from the lecture and onto one of my 'coding hobbyhorses'.
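Since modules came up, here is a minimal sketch of the interface/implementation split (my own toy example with made-up names, not anything from the lecture), written as a small Python module.  Callers only ever touch the two public functions; the underscore-prefixed state is the hidden implementation:

```python
# counter.py - a cohesive little module: each function does one thing.

_counts = {}  # hidden implementation detail; callers never touch this

def record(event):
    """Record one occurrence of an event."""
    _counts[event] = _counts.get(event, 0) + 1

def count(event):
    """Report how many times an event has been recorded."""
    return _counts.get(event, 0)
```

If the implementation were later changed to write counts to a file or a database, code that only calls record and count would carry on working - Parnas's information hiding idea in miniature.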

Through a short mention of a language called Simula-67 (Wikipedia) we were then introduced to a paper by Liskov and Zilles entitled, 'programming with abstract data types'.  We were told that this paper represented a sketch of a programming language which eventually led to the creation of a language called CLU (Wikipedia), CLU being short for Clusters.

There is one question Barbara clearly answered, which is: why go to all the trouble of writing a programming language?  It's to understand whether an idea works in practice and to understand some of the barriers to performance.  Also, whenever a language designer describes a language in natural language there are always going to be some assumptions that the compiler writer must make. Only by going through the process of creating a working language are language designers able to 'smoke out' any potential problems.

Just diverting into programming language speak for a moment: CLU implemented static type checking and used a heap, and it didn't support concurrency, the goto statement or inheritance.  What it did implement was polymorphism (or the use of generics), iterators and exception handling.
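The ideas that CLU pioneered now read as perfectly ordinary code.  Purely as a rough illustration (this is a Python sketch of my own, certainly not CLU syntax), here is an abstract data type that combines generics, an iterator and exception handling:

```python
from typing import Generic, Iterator, TypeVar

T = TypeVar("T")

class Stack(Generic[T]):
    """An abstract data type: callers see push/pop/iterate, not the list inside."""

    def __init__(self) -> None:
        self._items: list[T] = []  # hidden representation

    def push(self, item: T) -> None:
        self._items.append(item)

    def pop(self) -> T:
        if not self._items:
            raise IndexError("pop from empty stack")  # signalled as an exception
        return self._items.pop()

    def __iter__(self) -> Iterator[T]:
        return reversed(self._items)  # iterate from top of stack downwards

stack: Stack[int] = Stack()   # 'Stack[int]' is the generic (polymorphic) part
stack.push(1)
stack.push(2)
print(list(stack))  # [2, 1]
```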

Barbara also mentioned a very famous language called Smalltalk, developed by Alan Kay.  Different developments at different times and at different places have all influenced the current generation of programming languages.  Our current object-oriented languages enable programmers to define abstractions, or a representation of a problem in a way that wasn't possible during the earlier days of software.

Research directions

Barbara mentioned two research topics that continue to be of interest.  The first was the question of what might be the most appropriate design of a programming language for novices.  Over the years, these have included BASIC (which introduced the dreaded goto statement), Pascal and, more recently, Java.  The challenges of creating a language that helps learners develop computational thinking skills (Wikipedia) include taking account of programming language design trade-offs, such as ease of use vs. expressive power and readability vs. writeability, and how best to deal with modularity and encapsulation.

Another research subject is languages for massively parallel computers.  These days, PCs and tablets, more often than not, contain multiple processor cores (which means that they can, quite literally, be doing more than one calculation at once).  You might have up to four cores, but how might you best design a programming language that more efficiently allows you to define and solve problems when you might have hundreds of processors working at the same time?  This immediately took me back to my undergrad days when I had an opportunity to play with a language called Occam (Wikipedia).
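As a small, hedged sketch of what this looks like today (using Python's standard multiprocessing module and a made-up workload): the programmer says what to compute over the data, and the library decides which core computes each piece.  Making that idea scale gracefully from four cores to hundreds is exactly the research problem:

```python
from multiprocessing import Pool

def square(n):
    # A stand-in for some genuinely expensive calculation.
    return n * n

if __name__ == "__main__":
    # The pool spreads the work across the available processor cores;
    # the program text says nothing about which core does what.
    with Pool() as pool:
        print(pool.map(square, range(10)))  # [0, 1, 4, ..., 81]
```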

There was one quote from Barbara's lecture that stood out (for me), and this was when she said, 'you don't get ideas by not working on things'. 

Reflections

I should say at this point that I haven't done Barbara's lecture justice.  There were a whole lot of other issues and points that were mentioned that I haven't touched on.  I really enjoyed being taken on a journey that described how programming languages have changed.  I liked the way that the challenges of coding (and the challenge of using particular instructions) led to discussions about modules, abstract data types and then, finally, to object-oriented programming languages.

It's also possible to take a broader perspective to the notion of abstraction, one that has been facilitated by language design.  During Barbara's lecture, I was mindful of two related subjects that can be strongly connected to the notion of abstraction.  The first of these is the idea of design patterns.

Design patterns (Wikipedia) take their inspiration from architecture. Rather than design a new building from scratch every time you need to make one, why not buy a pre-existing design that has already solved some of the problems that you might potentially come up against?  There is a strong parallel with software: developers often have to solve very similar problems time and time again.  If we have a template to work from, we might arguably get things done more quickly and cheaply.

Developers can use patterns to gain inspiration about how to go about solving common problems.  By using well understood and defined patterns, the communication between programmers and developers can be enhanced since abstract concepts can be readily named; they permit short-cuts to understanding.
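To show how a named pattern compresses communication, here is a toy sketch (mine, not from the lecture) of one well-known pattern, Strategy: interchangeable behaviours hidden behind a single name.  Saying 'pass the delivery strategy in' conveys the whole structure in one phrase:

```python
def deliver(message, send):
    # 'send' is the strategy: any callable that knows how to deliver text.
    send(message)

def by_email(message):
    print(f"emailing: {message}")

def by_sms(message):
    print(f"texting: {message}")

deliver("Tutorial moved to room 4", by_email)
deliver("Tutorial moved to room 4", by_sms)
```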

In some cases, patterns can be embedded into pre-existing code that can be used by developers to kick-start a development.  This can take the form of a framework, software code that solves well known problems that ultimately enables developers to get on and solve the problems that they really need to solve (as opposed to dealing with stuff such as reading and writing to databases).

Abstraction has come a long way in my own very short career as a developer. One of the biggest challenges that developers face is how to best break down a problem into structures that can be represented in a language that a machine can understand.  Another challenge lies with understanding the various tools that developers now have at their disposal to achieve this.

Note: The logo at the top of the blog is used to indicate that this blog relates to a BCS event and this post is not connected with the BCS in any other way. All mistakes and opinions are my own, rather than that of the OU or the BCS.

Christopher Douce

Journey: London to Lincoln

Visible to anyone in the world
Edited by Christopher Douce, Sunday, 14 May 2017, 10:01


Riding from London to Lincoln on a motorbike is a blast.  I decided to be sensible and set off after the rush hour but I just couldn't wait.  I edged out into the London traffic at nine in the morning and quickly realised that I had made a mistake.  After about half an hour of wrestling with traffic, I was on an A-road heading towards the London orbital motorway.  Fifteen minutes later, I was circumnavigating a large chunk of London and heading towards the M11; a route that I hadn't done before.

The reason for my trip up to Lincolnshire was to visit my parents.  It was the third time I did this trip via motorbike and on this occasion I decided that I wanted to go on a journey that I had promised to take ever since I started to learn more about the history of computing.  

Lincoln is a city that I know well.  I spent quite a lot of time there, staying at my parents' house whilst I got my head down to spend many hours doing some computer programming for a research project I worked on a couple of years ago.  During this time I also gained my motorbike licence.  I used to spend hours riding to and from Lincoln, gaining some kind of perverse pleasure if I became snarled up in a traffic jam (since it gave me the opportunity to practise clutch control and feathering the back brake).  Gradually, some of the city's secrets revealed themselves to me; the links between the old and the new - the contrast between the imposing medieval cathedral and ancient castle juxtaposed against modern industrial units and trading estates.

The M11 was a dull but quick road.  Within a couple of hours I skirted past Cambridge, a city that I've been to a number of times before but barely know.  As I rode I made a mental note that I needed to return.  When it comes to the history (and the future) of the computer, Cambridge is a fundamentally important place.  My objective, at that moment, was to get to Lincoln and leave Cambridge for another day.

The M11 soon became the A1 and within hardly any time at all, I discovered the exit I was looking for: Stamford.  A gentle ride through this pretty market town soon gave way to quieter roads, the kinds of roads that motorcyclists love; roads that are gently undulating and sweep from left to right.  Not only were they undulating, they were also fairly empty, there was no rain and very little wind: perfect.  Small towns and villages came and went, my destination becoming ever nearer.  All in all, the journey took about five hours, including two stops (one for fuel, another for coffee).

After two days of catching up with my parents, the time had come; I was going to take a short trip to explore some places I had read about, had ridden past and had never properly seen.  I donned my protective 'gubbins' and set off across the fens.  There is this glorious road between the village where I was staying and Lincoln.  It's dead straight, with wide distant fields on either side - you can see for miles.

My objective was to get to the heart of the city and park in a place where I had seen bikers parking.  When I got to the city, I blundered my way through the one-way system twice before I bagged a space, vacated by a departing Ducati.  My first objective was to figure out where Silver Street was.  I looked up at a nearby street sign.  I had accidentally (or unconsciously) parked on Silver Street!  My next objective was to find number 34, the birthplace of George Boole (1815-1864).

If you're a computer scientist or just a casual user of a spreadsheet or database you will quite likely have stumbled across his name.  The terms 'boolean expressions' and 'boolean conditions' have, quite obviously, been derived from his name (in the same way that the word algorithm can be traced back to the name of a Persian mathematician).  I have to admit that I've only just started to scratch the surface of the history of Boole.  George's father, John, was a cobbler.  Apparently he was somewhat distracted by other pursuits, particularly mathematics and science.

I walked the entire length of Silver Street to try to find number thirty-four but quickly became confused; the street numbers were few and far between.  There seemed to be no discernible pattern.  I adopted the age old tactic of 'appearing to be confused' and barrelled into the entrance of an estate agent.  'Excuse me, mate, is this number thirty-four?' I asked a smart looking man who was wearing a shirt and sporting a tie.  'This is number thirty-two... I have no idea where number thirty-four is, might be next door?'  I offered a smiley thank you and returned to the street.

'Hello... erm, is this number thirty-four?', 'Yes!' came the delighted reply from a nice lady who was sat at a computer.  Number thirty-four, like number thirty-two was an estate agency.  'I've found it!' I exclaimed.  I took a step back and cast my eyes around the office-like interior, as if I was looking for some kind of shrine to the great man.  Instead, I saw a photocopier. 

The nice lady was bewildered.  I explained that where she worked just happened to be the birthplace of a famous mathematician (which appeared to bewilder her even more).  I was tempted to explain my enthusiasm by starting to talk about the importance of Boole and the history of the computer, but I felt that it was neither the time nor the place since I obviously wasn't interested in buying a house.  Realising that my first quest was coming to an end, I began to feel that I was making a bit of a nuisance of myself.  Before I went, I asked for their business card (to gain proof that their estate agency really was number thirty-four).  Sure enough, I had found number thirty-four Silver Street.

Boole invented something called Boolean algebra; I know his work in terms of Boolean logic, which I studied at college during my vocational course in computing.  He proposed a form of algebra that works with two states: one or zero, or true or false.  The reason why Boole's work became so important is that computers represent everything using numbers which are made of these two states.  Sound, music, images, video, computer software, documents, instructions to turn on burglar alarms - pretty much anything you can imagine can, ultimately, be represented using just 'on' and 'off'.  Strings of these states form numbers: the bigger the number of 'bits' (which are, in essence, Boolean on-off states), the more numbers that can be stored and moved around in a computer.

But why use those two states?  The answer is pretty simple: it makes the electronics simple.  By going with the simplest possible representation it's then possible to do ever more complicated stuff with a high degree of reliability.  One day, I hope to write something about electronic machines that worked with the kinds of numbers that humans work with - but that would require a much longer journey than the one I'm writing about.

I'm simplifying things terribly here (since I'm not a mathematician and I'm writing about subjects that are slightly outside of my area of expertise), but I think it's safe to say that Boole's work on logic is so fundamental that without it we wouldn't have computer processors or logic circuits.  Boole, ultimately, created the tools of thought that allowed us to work with logic states.  In software terms, an on-off state can be considered akin to an atom in the physical world.
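For the curious, the whole idea fits in a few lines of Python (a sketch of my own, not anything of Boole's):

```python
# Boole's two-state algebra: AND, OR and XOR over single bits.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, a & b, a | b, a ^ b)

# Strings of these on-off states form numbers...
n = 0b10011          # five bits
print(n)             # 19
print(bin(n))        # '0b10011'

# ...and numbers represent everything else: the character 'A' is just 65.
print(ord("A"), bin(ord("A")))  # 65 '0b1000001'
```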

Boole's birthplace wasn't the only place I wanted to visit.  After saying my goodbyes to the nice estate agent people, I had another quest: to go and visit the school that Boole founded.  I walked to the end of Silver Street, crossed a road, walked a bit, then got confused... and only then consulted my GPS-enabled mobile phone.  Within minutes, I was walking up a steep flight of stairs towards Lincoln's medieval cathedral.  It struck me that I had probably found a path that hadn't changed for a couple of hundred years; some of the steps had been visibly worn down over time.  Looking upwards, I could see the cathedral through a small archway in the distance.


When I was at the top, standing in the shadow of the cathedral, I consulted my phone again and figured out where I needed to go.  I knew the road I needed; I had ridden on it many times before during my bike training.  It runs from the bottom of the hill (where the industrial and retail part of the city lies) to the ancient part of the city.  The top bit can get a bit exciting, since it's quite a fast road and two lanes merge into one before taking a route past the cathedral.  Within moments, I had arrived at my second destination.  I peered through the railings at a lovely looking house and I soon found a plaque on the wall that indicated that I was in the right place.  Here's what it said: 'George Boole, father of modern algebra. Author of the laws of thought and first professor of mathematics at university college, Cork, was born in Lincoln and established an academy in this house c. 1840'.  Satisfied, I turned around, retraced my steps and returned to my bike.

Five days later it was time to return to London.  I set off ridiculously early, hoping to avoid as much traffic as I could.  The ride through Lincolnshire was beautiful.  There were moments when I could see where dew had touched the undulating roads in the distance; roads that appeared as ribbons of silver.  I was touched not only by the physicality of negotiating them, but also awestruck by the light and the experience that the roads were presenting to me.  By the time I had got to London, everyone was fully awake and the motorways that took me back to South East London were pretty solid.

I've now got some more work to do to answer a number of different questions: what was the time in which Boole lived really like?  Who else did Boole know?  What kind of work did he do after he left Lincolnshire?  How exactly did he influence other mathematicians, and did he make other contributions to mathematics (with a view to understanding its connection with computing), other than the ones that I've already touched on?  Time, of course, is the challenge: there are so many other questions out there that are interesting!

I've also got some plans for the next journey. I'm going to stick around in South East London for a bit and then cross the river for another Babbage related adventure.  I'm going to be spending quite a lot of time in London before venturing further afield. 

