Christopher Douce

Using the Kindle for research and studying

Edited by Christopher Douce, Monday, 3 Nov 2014, 15:03

I have to confess that when it comes to some technologies, I am a bit of a laggard.  It was only very recently that I decided to get to grips with understanding the mysterious world of eReaders.  I have two excuses: the first is that there’s just so much ‘tech’ to keep on top of, which means that it’s difficult to know what to do next (which is actually a pretty lame excuse), and secondly, I’ve always been a bit sceptical about the screen quality of eReaders.

About a month ago, I requested a new book from the University library to do some preparatory reading for a new course I’m involved with.  The library turned around my request pretty quickly, but they also sent me an email that suggested that I’ve got some figuring out to do.  It was: ‘we only supply that text book in eBook format’.  No dead tree variety?  No, apparently they don’t do that anymore.

Back at home, I searched around for a box that contained a discarded Christmas present that one of my relatives had received and had then given to me after a couple of months; it was an Amazon Kindle.  After figuring out how to give it some power, the first thing I did was connect it up to my Amazon account.  I was gradually finding my way into being a ‘contemporary reader’.

This blog might be useful for anyone who has to use these eReader devices for their studies.  It might also be useful for any of my colleagues who have to battle with the mixture of convenience and frustration that accompanies the use of eReaders. 

I say ‘eReaders’, but what I actually mean is ‘Kindle’, for now.  And when I say ‘Kindle’, I actually mean the really old ones with keyboards and black and white screens, not any of those new-fangled colour models.

The first section is all about figuring out how to read a text book.  The second section is all about how to download Open University on-line materials to your device (so you can read it on the go).  Some of the OU courses are presented entirely on line.  Two examples of this are: TT284 Web technologies, and H810 Accessible on-line learning: supporting disabled students. I describe how you might (potentially) go about downloading an on-line course to your device, so you can get ahead with your studies.

The third part is a bit of useful fun.  I asked myself the question, ‘I wonder what books I can get hold of for free?’  The answer is, ‘actually, quite a few’.  In the final section I share a few tips about how to download books that are out of copyright.  I, for one, haven’t been a great reader of the classics (I’ve been too busy messing around with computers; another lame excuse), but there are loads that are freely available.

Working with text books

Apparently, the OU has a website called Mobile Connections, which offers some guidance about the use of mobile devices (OU website) and pointers to mobile strategy documents.  This is all very well, but how do I get a text book onto my device?

After clicking around the university library and attempting to access the text book that I wanted to ‘take out’, I was presented with the following message: "Patrons using iPads, iPhones or Android devices can download and read EBL content via the free Bluefire reader app."  Now, I don’t have an iPad or an iPhone, and I’ve explicitly made a decision not to read any textbooks on my Android phone simply because my eyes are not up to it.  I hadn’t heard of the Bluefire app before, but the Bluefire website may or may not be useful.

Another part of the library message was that "Downloaded EBL ebooks can also be transferred to any portable ebook reader that supports Adobe Digital Editions (ADE). There's a list of these compatible devices on the ADE website".

I had never heard of Adobe Digital Editions before but I’ve managed to find an Adobe website that offers a bit of information.  I had a good look on the ‘compatible devices’ list and my Kindle device wasn’t listed, which was pretty frustrating (to put it mildly).

All this frustration highlighted a division between two different formats: one called ePub and another called mobi.  Apparently ePub is an open standard, whereas mobi is owned by Amazon.  I soon saw that I couldn’t put ePubs on my Kindle, which was a bit rubbish.

 I asked myself two inevitable questions: ‘is it possible to convert an ePub to a mobi, and if you can, how do you do it?’  Thankfully, the internet is a wonderful thing, and I soon found a product called Calibre (website).  Calibre is described as a ‘free and open source e-book library management application developed by users of e-books for users of e-books’.  It’s a tool that you can download onto your PC, put an ePub in one side, and get a Kindle mobi book out of the other (with a bit of clicking and messing around in between).

One thing that Calibre can’t do is get around DRM.  DRM, or digital rights management, is used to protect media from being copied between different devices (which is why you need software like Adobe Digital Editions).  If your ePub is protected by DRM (that is, if the publisher has said that it can’t be copied), then you can’t convert it from one format to another.
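
If you prefer a command line to clicking around, Calibre also ships with a command-line tool called ebook-convert that does the same job.  The little Python sketch below shows one way you might drive it; the file name is made up purely for illustration, and it assumes that Calibre is installed, that ebook-convert is on your PATH, and that the ePub is free of DRM.

import subprocess
from pathlib import Path

def epub_to_mobi(epub_path: str) -> Path:
    """Convert a DRM-free ePub to a Kindle-friendly mobi using Calibre's ebook-convert."""
    source = Path(epub_path)
    target = source.with_suffix(".mobi")
    # ebook-convert works out the input and output formats from the file extensions.
    subprocess.run(["ebook-convert", str(source), str(target)], check=True)
    return target

if __name__ == "__main__":
    # Hypothetical file name, purely for illustration.
    print(epub_to_mobi("module_textbook.epub"))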

For the sake of argument, let’s say you’ve got a freely available text book that is useful for your module.  How do you go about transferring it to your Kindle?  In my naivety, I thought I could use the ‘old school’ technique of plugging it into the USB port of my computer and dragging files around.  Unfortunately, due to the OU’s information security management policy, staff cannot write data to external USB devices.  As soon as I connected up my Kindle, I was presented with a message that read, ‘do you want to encrypt your device?’  If you’re ever asked that question about any eReader you have, say ‘no’ straight away.  Thankfully, I did have the foresight to say no, as otherwise my Kindle would probably have been rendered useless.

Since I was unable to transfer my mobi files directly from my PC to my Kindle, how should I do it?  The answer came from a colleague: you email the books or any files that you want to read on your device to your Kindle email address.  When you’ve done this and you turn on your Kindle, magic happens: your document is downloaded.  If you’re interested, Amazon have some helpful pages (Amazon website).
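
For the curious, the same ‘email it to your Kindle’ trick can be automated.  Here is a minimal sketch in Python; every address, host and password below is a placeholder, and note that Amazon will only accept mail from sending addresses that you have approved in your account settings.

import smtplib
from email.message import EmailMessage
from pathlib import Path

def send_to_kindle(mobi_path, kindle_address, smtp_host, smtp_user, smtp_password):
    """Email a mobi file as an attachment to a Kindle email address."""
    book = Path(mobi_path)
    message = EmailMessage()
    message["From"] = smtp_user
    message["To"] = kindle_address            # e.g. yourname@kindle.com (placeholder)
    message["Subject"] = book.name
    message.set_content("Document attached for Kindle delivery.")
    message.add_attachment(book.read_bytes(), maintype="application",
                           subtype="octet-stream", filename=book.name)
    with smtplib.SMTP_SSL(smtp_host) as server:  # port 465 by default
        server.login(smtp_user, smtp_password)
        server.send_message(message)

# All of these values are placeholders:
# send_to_kindle("block1.mobi", "yourname@kindle.com",
#                "smtp.example.com", "me@example.com", "app-password")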

Working with OU resources

More and more OU resources are being made available in Kindle and ePub formats.  This, I believe, can only be described as a ‘very good thing’, since some of the OU books can be pretty bulky.  When you’re working with an eReader, you can sometimes put all your module materials on your device.  When I go to tutorials, I tend to bring all the OU books with me – but rather than carrying them, I have them all preloaded on a Kindle.  This said, I am a great fan of paper; you can do things with electronic devices that you can’t do on paper, and vice versa: you can search for a term in an eBook, whereas you can scribble in your paper books with different coloured pens (and stick things between pages).

Not long after starting to mess around with my Kindle, I realised I could do exactly the same with the other module materials I need to work with from time to time.  I quickly realised that there would be a problem: things would start to get pretty confusing if I had all the different eBooks in one place on my Kindle.  Thankfully, the Kindle has the concept of a category.

After emailing a load of different mobi books to my Kindle, I noticed that my ‘TT284 category’ (I thought it was a good idea to group resources based on module code) quickly became overloaded, and I noticed that the default display order was the order in which the books were downloaded.  Although this was useful, I got myself into a bit of a muddle with the download sequence.  I soon realised that it’s possible to change the ordering according to title, which made for a really nice sequence of module materials.

I’ve now got categories for all of the different modules I have downloaded resources for: H810, TT284 and M364 Fundamentals of Interaction Design.  For M364, I have a mobi version of the assignment booklet, and PDF copies of the four blocks.  I don’t, however, have a copy of the set text. 

The M364 set text is huge, and it’s a real pain to carry around, and students have regularly asked whether there are electronic versions that they could download.  Unfortunately, publishers are only just beginning to catch up with the new ways in which institutions and students consume their materials.  For now, we’ve got to battle on with a mixture of paper text books and OU materials which can be provided in a digital format.

Free books!

After months of it being in a box on my shelf, I’ve finally figured out how to use my Kindle.  Now that it’s jam packed with learning resources and I’m getting used to its screen (which isn’t too bad), I started to think about how I might use it to read stuff ‘for fun’, i.e. using it to read novels and non-fiction.

I quickly remembered Project Gutenberg, a project dedicated to digitising books that are out of copyright.  I took another quick look and discovered that they now had books in eBook format, which was great news.  A quick look around took me to an interesting page called the Best Books Ever Listings (Project Gutenberg).  I also discovered all these different ‘bookshelves’ organised by topic.  I really recommend that you have a good look around.

Another really good source of free (or really cheap) books is Amazon.  Within minutes of looking around I found a number of classics that I had never read before.  I clicked on a ‘buy’ button, and these new books were delivered to my device.  (Plus, since an eBook doesn’t have a cover, you can download some particularly racy books and read them when you’re on the train and no one would be any the wiser…!)

And finally…

As I said earlier, it sometimes takes me a while to get on top of a technology; I used to be someone who always wanted to mess around with the latest technologies and gadgets.

I don’t really know why it’s taken me so long to get to grips with eReaders.  I’m someone who likes the feel, smell, and flexibility of physical books.  This said, I’ve come to see that eReaders can give learners a flexibility that they never had before: an ability to carry everything around easily, and the ability to search for terms and phrases.  When a lot of material has moved ‘on-line’, eReaders can help us to access content in a convenient way without always being tied to a computer.  I think this is a really good thing.

I’m someone who loves to make notes.  One thing that you can’t do (very easily) is make scribbly notes on eBook pages, but that is okay: I’ll just have to figure out some new study strategies.

The more that you look at something, the more you think about different possibilities.  Looking at the Kindle has caused me to ask myself a further question, which is: how might you create an eBook from scratch?

Christopher Douce

Chris’s Exam tips – Part 2 : During the exam

Edited by Christopher Douce, Sunday, 8 May 2022, 14:41

This is the second in a series of two blog posts about revising and taking exams.  The last post was about exam revision tips.  This post is slightly different: it’s about how to tackle the exam on exam day.  Some of these might sound mind-bogglingly obvious, but sometimes it’s good to say stuff that is obvious.

1. Read the exam paper

This first tip is certainly one of those ‘school of the blindingly obvious’ tips, but it’s also one that is blindingly important, especially if you’re going to use some kind of strategy to work through the exam paper. 

A really important bit of any exam paper is the examination rubric: the instructions at the front of the exam that tell you which questions to answer.  Ideally, you should be in a situation where you know exactly what it is going to say (since you’ve already had a good look at a number of past exam papers).  The rubric shouldn’t change (it would be very surprising if it did), but it’s always a good idea to read it carefully to make sure you know what it is asking of you.

Reading the rules is really important: what you don’t want to do is to spend too much time answering questions that you don’t need to answer.

If you’re someone, like me, who hates exams, one thing to do is to spend a couple of minutes looking over some of the questions.  This way, you get a feel for what you have to do.  Also, when you see questions you can clearly answer, this will start to put you at ease (and you will know how to answer some questions, since you’ve been revising hard).

2. Pick an easy question to start with if you are stuck

Exams can be pretty stressful.  When we’re under stress, we can experience that feeling when our minds go blank, where we think that we can’t remember anything.  The reality is that we can remember everything that we’ve revised - we’ve just got to find a way to access it.

A really good way to ‘get going’ on an exam is to start with an easy question; a question that you know that you can answer.  As soon as you’ve started to write an answer, you usually start to remember things.

3. Think about and apply your strategy

Time is really important.  Three hours can pass in a flash.  One approach to exams is to try to gather up as many ‘easy marks’ as you can.  This strategy can help you to free up more time to focus on the tougher questions that could take a whole lot longer.

One good question to ask yourself is which questions you’re going to tackle in what order.  There is no reason why you can’t tackle questions in a different order to the sequence that they are presented in (unless there is a very good reason, of course!)

When I was an undergrad, someone told me that you could ‘break the exam rubric’, which means that there are instances where you might want to answer more questions than are asked of you.  If there is an exam paper that says ‘answer two out of the five questions that are given’, there’s no reason why you can’t go ahead and answer three, for example.  You might choose to ‘hedge your bets’ by perhaps choosing two of your strongest questions, followed by another question that you think you might do well at (providing you have the time, of course!)

If you ‘break the rubric’ what usually happens is that the examiner has to mark everything, and you get the marks for the questions that you do best at.  Don’t worry about making the examiner work.  Make them sweat.  It’s your exam, so you should feel free to answer as many questions as you can answer.  

4. Try to get into the mind of those who wrote the exam

Ask yourself the question: what is it that they’re trying to get at?  The module team will invariably be looking for evidence that you understand a particular concept or idea, so try to communicate your knowledge and understanding as clearly as you can.  Write full sentences, use keywords, or leave bullet points.  If it helps, draw a diagram or make a table.  Underline some of the key concepts, for example.

If you’ve revised well, you should be able to see echoes of the module learning objectives within each exam paper.  Working through past exam papers helps you get into the dark and devious psyche of those who wrote the exam paper.  Look at the questions carefully: are they trying to assess your knowledge of a concept, or are they encouraging you to apply your knowledge in some way to solve a particular problem?

5. Write anything

This bit of advice sounds a bit crazy. If you find yourself in a situation where you are not sure how to answer a question, write down anything. What I mean is write down concepts, ideas, or keywords that you think might relate to the question that is being asked. The process of writing down ‘anything’ may trigger other thoughts and memories that may help you to connect with the question that you’re answering.  Plus, if you write down ‘anything that might be connected to the question’, there’s a chance that you might get some marks.  Writing ‘anything’ says to an examiner that you know ‘something’.

The reason why I think this bit of advice is important comes from my experience of being an examiner.  The role of an examiner is always to give you marks, not to take away marks from you.  There have been instances where I’ve seen exam papers with big sections that were empty.  I always find it sad when I see blank spaces in an exam paper: a blank space or an empty question is a missed opportunity.

Even if you’re not sure what the question means, or what the exam team is looking for, put something down. If you put something down, there’s a chance you might get some marks. If you don’t put anything down, you certainly won’t get any marks. It’s always worth the risk.

6. Use all the time that you have available

The time that you have in the exam hall is your time, so do feel free to use all of it.  Call me weird, but I’ve always sat out an exam to the end.  In the time that you have, don’t miss sections out, take time to check through what you’ve written and ask the question ‘is there anything that I could add to this question to convince the examiner that I know my stuff?’  Sometimes you can discover that there is another way of answering a question, or it’s possible to add a further perspective.  Like I mentioned earlier: make the examiner work!

If you need another couple of answer booklets to present your knowledge and understanding, that’s fine: these should be available.  Put up your hand and ask the invigilator for some more paper.

7. Never cross out big sections

Sometimes you need to make some notes, do some rough working out for a maths or engineering problem, or write a short essay plan.  If you need to make some ‘notes’ about anything, never cross them out.  Instead, leave a comment to tell the examiner how your notes are connected to a particular question.

If you put a line through a section of writing on your exam script, this tells the examiner that you don’t want a particular section to be considered as a part of your answer.  Let the examiner decide what is important and what is not: give them the opportunity to give you marks.

8. Go celebrate!

Do you remember that I mentioned ‘goal setting’ in the last blog?  Set a goal to do something after you’ve completed an exam.  Exams are stressful and take up a lot of time and energy.  When you’ve finished, go and do that something special that you promised yourself: you deserve it!

And finally…

These tips work for me.  Different people, of course, will have very different tips.  Why not ask other students what they do?  Also, there are some really good resources out there that you might find useful. 

One resource that I really recommend is The Good Study Guide by Andrew Northedge.  If you have access to this book, check out chapter 12, ‘preparing for examinations’.

Another set of really good resources is the Revising and Examination section of the Skills for Study website.  Do take a bit of time to go through these resources.  It contains some really good ideas.

Once again, good luck in your exams!

Christopher Douce

Chris’s Exam tips – Part 1: Revision

Edited by Christopher Douce, Sunday, 8 May 2022, 14:44

I hate exams, and I’m sure there are many people out there who have exactly the same view, but I’ve come to see them as a necessary evil.  The more exams I’ve taken in my life, the less I’m fazed by them.

This series of two blog posts represents a summary of all the bits of revision advice and exam related tips that I have used and discovered during my career as a student, both within the Open University and at other institutions. This first post is all about revising.  The second post offers some tips for when you take your exam.

If you’ve got an exam in the not so distant future, I hope these two blog posts are useful!  Also, feel free to print out, share, or do whatever you want with these tips.  If you find them useful, that’s great!  Do feel free to send me a message; it’ll be great to hear from you.

1. Get organised

The moment you know the date of an exam, put it in your diary so you know when it is.  The very act of recording it will help you to remember it.  This date is usually immutable; it cannot be moved.  Locate all your study material, and find a good place to study.  Also, have a think about the times and places when you can study.

2. Make notes of module materials

When you’ve sorted everything out, you might be a bit surprised by how much course material there is.  Make notes of the key sections in a module.  The process of making notes will make things go into your head.  A really good technique is to paraphrase (or rephrase) concepts; putting things in your own words is an invaluable skill, and it’s exactly this kind of skill that you will need to use in an exam.

If you have a set of notes for a block, or section of a module, try to make notes of your notes.  Try to make notes so you can represent the entire module on a single A4 sheet of paper.  When you’ve done this, all the other details will be in your head (so you won’t need any more materials).

If you have a regular commute, consider writing some of the key ideas on cards.  Create notes in whatever form works for you.

3. Little and often

I remember being told this tip when I was preparing for my GCSE exams: don’t try to cram, instead, try to revise in ‘chunks’.  This is great advice.  Cramming, or trying desperately to learn everything the night before, can just leave you worried and tired.  If you’re tired from reading course materials at 3am, the likelihood is that you’re not going to give your best exam performance.

Instead, try to revise when you can.  Figure out what ‘dead time’ you have in your day, and try to use the time that you find. 

When I was revising, I always set myself an overly ambitious goal of, say, revising for an hour.  If I’m being honest, I never really managed to do an entire hour: my mind would start to drift away from the subject I was studying; I would start to look out the window, think about my shopping and wonder whether I ought to do my hoovering.   It’s at that point you need to get away from the books because your brain is telling you something: you need a break.  A good break, where you do something different, will help you to remember more stuff.

4. Find the best time (and place) that works for you

Some people are early birds, other people are night owls.  Which are you?

Knowing what time of the day works best for you can really help you to get on top of everything.  If you find that you work best in the morning, one thought is to make sure that you’re ready to ‘hit the books’ in the morning when you wake up.

Finding a good revision space is really important; find somewhere that is quiet and free from distractions.  Let your friends and family know what you’re doing so they know not to disturb you (except for the times when they pop in to deliver a supportive cup of tea or coffee).

5. Think about creating a revision timetable (or habit)

This really is one of those ‘take it or leave it’ tips.  My own view is that life is too short to make a timetable, but what is important is the creation of a revision habit.  I used to try to revise at exactly the same time every day, between 10 and 11 in the morning.  I didn’t do this because I was obsessive or weird (although some people might argue otherwise!), it was because I knew that this was the time when I was more likely to remember and understand things.

The thing with creating a timetable or creating a habit is that when you have either of these, not doing what you said that you would be doing has the potential to make you feel guilty (and guilt, of course, can be a motivator to prevent you from doing the same bad things again).  Another idea is to tell other people that you’re doing revision at a particular time, so they can then ask you the inevitable question of, ‘shouldn’t you be doing the revision that you told me about?’ when your housemate or partner catches you watching some game show on the television.

6. Find as many exam papers as you can stomach

Can you get access to past exam papers?  If so, try to get hold of a good number of them.  Don’t go too mad and download loads.  If you get hold of too many, there’s a good chance you might lose the will to live and start to question your own sanity.  Instead, go back between three and five years, depending on the module.  If you go back any further there’s a possibility that your module or course might have changed, and you wouldn’t want to waste your time looking at exam papers that are not going to be useful.

When you’ve got a past exam paper, give it a read through; get a feel for its structure and how it’s laid out.  The last thing you want is for the exam paper to surprise you on the day of your exam – this could happen if you don’t take the time to prepare.

When you’re looking at the papers, ask the question, ‘what learning do you think the examiners are trying to look for?’  If your module has published any learning outcomes, try to relate or connect the exam questions to these outcomes.

Work through a number of past exam papers from beginning to end, but hold one back (you could use it as a ‘timed exam’ that you sit under timed conditions).  When you’re working through the exam paper, refer to the module or course materials – try to write the best answer you can.  If there’s something that you don’t quite understand, try to take this opportunity to learn what it is the exam questions are asking about.

If you get stuck, ask your tutor for help and guidance.  They don’t mind answering questions.  In fact, they’re paid to answer your questions, so feel free to use them as a resource.

7. Figure out your personal exam strategy

Now that you’ve got a whole bunch of exam papers, ask yourself which bits of the exam you are most comfortable with.  What does the rubric at the front of the paper say?  (The text at the start of a paper that tells you what questions you should answer).  Do you have to answer every question, or do you have a choice?  If you have a choice about what to answer, this might suggest what aspects of revision to concentrate on.

If there’s a part of an exam paper that you find either confusing or boring, think about whether there are other parts that might be a bit more interesting.  Think about what questions motivate you.  The more motivated you are, the better your revision will go.

Time in the exam hall is limited.  This means that you might also want to think about which questions you might be able to answer more quickly than others.  If you think about your ‘exam answering strategy’ before you see the real exam paper, you’ll feel a whole lot more comfortable on the day.

8. Seek out examination preparation tutorials or events

I’ve got a confession to make: it takes me ages to read things, and when I do, I don’t always take in what is written down.  I find that I can learn a whole lot better if I listen to either audio recordings or, better still, go to lectures.

If your module has got any examination preparation or revision tutorial events scheduled, then do what you can to go to them.  They are invaluable.  Also, the fact that you choose to sit down in a room with a bunch of other people for a few hours means that you can’t easily escape: you have no choice but to concentrate on what is happening within the tutorial or day school.

Another benefit of going to revision preparation sessions is that you can often be motivated by other students who are in a similar situation to you.

Use your tutor; ask him or her as many questions as you need to.  Tutorial revision sessions can be invaluable.  Even if you have to travel quite a long way to get to one of them, do your best to get to one.

9. Invent your own exam questions

Each module is divided into a number of different blocks.   Each block usually contains a set of learning objectives.  These objectives are really useful since they are connected to both the assignments that you complete, and your end of module exam.

If you’ve worked through all your past exam papers and you’re starting to become thoroughly bored, why not look at the learning objectives and try to invent your own exam questions?  This is a really great thing to do; you stretch not just your creativity but also your knowledge of the concepts that are presented within your module.

10. Set a personal goal

When you’re revising, set a goal.  When you’ve achieved a goal, reward yourself with a treat.  The treat could be just about anything, but make sure that it is personal to you and your life.  Your treat might range from a walk to your local park to something as simple as a cup of tea and a biscuit after you’ve waded through a whole chapter on Java programming, for instance.

The bigger the piece of revision, the bigger the treat could be.  If you’ve finished going through an entire block from beginning to end, then choose something that makes you feel happy.  Revision is work, and it’s hard work, right?  If you have something to look forward to, it might become slightly easier.

11. Make time for yourself

I think this tip is really important: be good to yourself.  Taking exams is a tough business, and you’re the only one who is going to get through them; it’s really important to take care of yourself.   Don’t forget to have fun.  You’re not going to be able to revise to the best of your abilities if you’re always stressed or worried.

I remember this time when I was a second year undergraduate student.  The second year results are, of course, really important when it comes to your overall degree classification.  I had been revising hard, and if I’m honest, I was pretty sick of it.  There were some parts of my studying that weren’t going very well, and I was getting a bit frustrated and taking it out on other people who were around me.

In the middle of the revision period, I decided to go out to the student union to have a bit of fun.  It wasn’t as busy as it usually was (because everyone was revising), but it turned out to be a brilliant night.  The next day, I felt a whole lot more relaxed: I needed a bit of time out.

I guess what I’m trying to say is remember to find time to relax; try to forget about all your revision for a bit.  Do whatever it is that you need to do so that you can relax.  We all do different things, so find that thing that works for you.  It helps you gain a bit of perspective.

12. Prepare for the day

Taking a bit of time to look into the practical issues that surround your exam is going to take a lot of stress away when it comes to the day.  A big question to ask really early on is: where is your exam going to take place?  Follow up questions are: how can I get there, and how long is it going to take me?  Since I hate being late for things, I usually plan to get a train or a bus before the one that I would need to catch to arrive on time.  Usually, it means that I get to places ridiculously early!

There are a couple of advantages to being ridiculously early: you, of course, add a bit of contingency into your journey if things go wrong (if the train is delayed), plus, it gives you time to check out the exam centre and become familiar with the place.  Being early (but not too early!) can give you some precious relaxation time.

If you’re someone who needs to do some last minute reading before the exam, it’s okay to do this, but do constrain yourself to your notes, rather than huge text books (which are just going to stress you out).

Also, make sure that you’ve got everything you need: make sure that you’ve got all the pens, pencils and erasers that you might need.  A really good tip that I was told once was: ‘always take a spare pen with you; the last thing that you want is to start panicking if your pen stops working when you’re in the exam hall’.  Make sure that you have a bottle of water and maybe even something to eat.  One of my guilty secrets is that I’ve taken a chocolate bar into some exam halls so I can get an energy boost.  One word of advice is: don’t take crisps into exam halls.  Invigilators get very unhappy about crisps!

And finally…

These are my tips, and other people will say different things and give different advice.  Also, some people will, no doubt, disagree with some of these thoughts – but, that is okay: we’re all different, and we all study in different ways.

I’ll leave you with a question: what is your favourite revision tip?  If someone asked you for the single biggest tip that you would give someone, what would it be?

Good luck with your revision!

Christopher Douce

First MCT Student Support Team Conference


I drafted this blog post some time ago, but it got temporarily shelved due to the reality of day to day work.  I never forgot about it, though!  I still feel it’s important to share, since it relates to an important (and on-going) change within the university: the development and implementation of the Faculty of Mathematics Computing and Technology (MCT) Student Support Team (SST).

Introductory bit

The nineteenth of July 2014 was the date of the first ever Faculty of Mathematics Computing and Technology (MCT) Student Support Team (SST) conference, which was held at The Open University campus in Milton Keynes.  Although this blog is written primarily for my colleagues, fellow tutors and students might find this summary useful since everyone has been subject to different amounts of change within the university.

The SST comprises learning support staff (who are based in Birmingham and Nottingham), associate lecturers and central academics.  The idea behind an SST is to create a grouping of people who have more detailed knowledge of how to support students who are studying a particular subject, so we can improve the way that students are supported.  The ideas behind SSTs predate the increased focus on programmes of study due to the availability of student loans for part time students.

Although the SST staff for MCT are based in two regions, all the other Open University regions remain fundamentally important to the operation of the university: they remain centres from where tutorials and day schools are run, outreach events are facilitated, and students can have additional support sessions with tutors (and students can drop in to look at module materials).  They are also essential places from where continued AL development and training occurs.  Without these centres, tutors would not have sufficient training to allow them to offer excellent teaching and learning experiences to their students.

This blog starts with a summary of the plenary or introduction session, and is then followed by a session about retention.  There is then a brief description of a session that was specific to the department of Computing and Communications.  This is then followed by an AL development session about disabled student services (this might be of interest to some students who study H810).  The day ends with a session about ‘tutor staff development’.

Introduction plenary

Our former Associate dean for regions and nations introduced the conference and welcomed us all to the SST.  We were shown a bunch of graphs that gave us a bit of context.  I remember that one of the graphs was about the number of students who are studying at a study intensity that is equivalent to full time students (which is a way to allow the OU to be compared to other institutions).  In terms of full time equivalent (FTE) students, the number of students in MCT is broadly similar to the number of students in the Faculty of Arts and the Faculty of Social Sciences.

An important point is that undergraduate computing and IT modules currently represent the biggest group in the faculty, with 60% of students registering for a BSc in Computing and IT.  By way of perspective, there are 13K FTE students in the whole of MCT, whereas Birkbeck (as a whole institution) has a total of 17K students.  We were given even more mind blowing stats: there are 1,200 associate lecturers in MCT, who between them hold a total of 2,900 contracts.

Retention and progression (of students between different modules) is considered to be an important strategy.  One figure that I made a note of was that there was a target of 75% of all students moving to the next module (unfortunately, I didn’t note whether this was related to level one modules only).

Our associate dean also spoke about other university priorities, such as helping to design student interventions (to make study easier), the student induction process (to help students become more familiar with the process of studying) and the development of the OU virtual learning environment (to make smart use of data).  On the subject of interventions, one great intervention that I heard about was the SST proactively calling students to discuss their study intentions if they have registered for an excessive number of modules.

One of the most important elements of OU study is the relationship between a student and their tutor.  The situation used to be that a student was left to their own devices at the end of a module.  The SST can now talk to a student between modules.

We heard about future plans.  Apparently there is some work afoot to improve the induction process.  For quite a while, I’ve heard about a new induction website, but I really do think there’s an opportunity to do more to help students become familiar with how to ‘survive’ when being a distance learning student. 

Future university activities include an attempt to diversify income, and continuing alignment of associate lecturer services to student services.  Also, quite recently, there has been internal debate regarding a new group tuition policy (which I hope to share some information about with ALs as soon as the details have been defined).

MCT Level 1 retention review

The next part of the day was by a colleague who spoke about a retention review project.  A really interesting fact I learnt was that module presentations that start in October have slightly higher retention figures than modules that begin in February.  I have no idea why!

I’ll do my best to summarise some of the key points that were made.  A really important concern was how well students are prepared for study.  If a student comes to the university without any prior qualifications, they are less likely to complete.  There might be a range of different factors that influence this, such as available study time, workload, and other points, such as a lack of confidence (writing from experience as a former OU student, it takes time to ‘figure out’ how to consume the learning resources that we get from the university).

The one really important point that has made the difference is: our tutors.  Proactive contact with an associate lecturer can really help to improve retention.

I’ve made a note of some recommendations: it’s important to have consistent module advice, we need to have effective subject specific advice, and we need a solid induction programme that incorporates the development of on-line study skills.  The characteristics of a module and its design can make a difference too: module teams need to consider workload, to think carefully about assessment, and to design study events into the module structure.

According to my notes, the review also offered some really practical suggestions, such as the SSTs could explicitly contact students without previous qualifications (but there’s also a tension between balancing the need to answer the phone in response to incoming student queries, and the willingness to initiate positive interventions: everything has a cost, and everything takes time).

I also made a note of the importance of student support:  AL development is to prioritise soft skills training for ALs (and perhaps points such as using the phone).

Over the last year, the university has made available something called a ‘student support tool’, which brings the associate lecturer closer to the systems that the university uses.  Whilst such tools are great, there is a question which needs to be asked: how can any tool be used effectively?  Again, it’s back to the importance of helping to train and develop our brilliant associate lecturers to make sure that they are as well-equipped as they can ever be to support their students.

Computing and IT session

The next session in the day was by John Woodthorpe, academic lead for Computing and IT.  John opened the session by asking us a question:  ‘Put up your hands if you’re a member of the SST!’  His point was simple: we’re all members!  We all have an important role to play.

John asked a rhetorical question about what the members of the SST in Birmingham and Nottingham are doing.  He answered this by providing a long list of activities.  They helped to book and organise exams at different venues, offered study and course choice advice, wrote disability profiles, and were always developing relationships with different members of the faculty.

During the session I made note of the phrase: ‘pre-emption, proactivity, prevention’, which seems like a good way to encapsulate different important aspects of study support.

One gap that has been identified is the need to develop a more comprehensive and structured approach to resubmission support, i.e. what happens if a student doesn’t pass an exam.  At the moment, students could call the SST and request a special session, but not all students know that they may be entitled to this support.  The SST is trying to tighten up resubmission support, aiming to offer more support for students beyond the boundaries of individual modules.

Disabled Student Services

After the faculty session, we were offered a number of different session choices.  I chose a session which was all about disabled student services.  I was interested to hear how the new SST would help MCT students who have disabilities.

We were introduced to different models of disability: the medical model and the social model.  A really interesting point was that people who have disabilities can have a ‘fluid sense of identity’.  What I think this means is that some disabilities may be transitory (such as a physical injury, such as a broken leg), and others may be episodic (such as ME).

We were also introduced to the Equality act 2010, which identifies a number of protected groups.  We were also told about different types of discrimination.  These are direct and indirect discrimination (which I had heard of), but there is also discrimination by association, i.e. discrimination might arise against someone if they have significant caring responsibilities.

We had a discussion about on-line tutorial materials, reasonable adjustments, and the provision of advice and guidance (of which, module accessibility guides play an important role).  We also spoke about the importance of contacting students directly to enquire about the extent of any adjustments that might be necessary, and also the role of examination boards.

Another area that was covered was the Disabled Students Allowance (DSA).  The DSA covers three areas: ergonomic furniture, personal support (such as non-medical helpers), and travel.  The DSA no longer funds standard electronic devices such as computers since they are now deemed to be an essential part of academic life, but funding for upgrades is permissible.  One really important point is that DSA is only available for students who are studying 30 points and are registered for a qualification (in Scotland this is 60 points).

Non-medical helpers can be really helpful for some students; they can make the difference between a module being accessible and a module being inaccessible.  A non-medical helper always works on behalf of the student: there is a contract between the student and the helper; tutors (or the university) are not able to speak directly to a helper – this relationship is fully controlled by the student.

AL Staff Development: The art of the possible

This final session of the day was all about how we should collectively think about associate lecturer development now that we’re in the new world of the student support team.  This session was facilitated by fellow staff tutor and tutor, Frances Chetwynd.

Associate lecturer development broadly takes two forms: generic development (which is about how to perform teaching and learning, make use of technology, and be aware of university initiatives and developments), and module-specific development (sharing good practice about teaching certain subjects).  Now that we’re moving into broad teams, there are opportunities in terms of large subject (or even programme) specific events.

Like all good tutorials, Frances’s session was very interactive.  We were given sets of post it notes to propose new ideas.  Pink post-it notes were about face-to-face AL development ideas, orange post it notes were about on-line (or other) types of development sessions.  When we had written down our ideas, we stuck them on different sheets of paper that had broad different category headings.

Frances also gave us ten stickers (which were in the form of ‘gold stars’), allowing us to collectively vote for the ideas that we liked best. 

The ones that I’ve noted down are: a session about ‘coding at different levels’ (which could be like some kind of internal symposium), teaching Sense on TU100 (apparently there is a 10% tutor churn on TU100), making sure that we record module briefings, and sessions about qualification overviews (or future plans for modules).  There were loads more ideas, but I haven’t noted them all down.

Reflections

It was a busy day!  One thing that was really good about the conference was that it put the work of the SST and the MCT faculty in context (I had not heard of some of the numbers that our associate dean mentioned) and also helped to emphasise how much change has been going on within the university in the last few years.  It almost allowed us to ‘take stock’ of where we were. 

The session on AL staff development really got me thinking.  I came up with a number of ideas about different types of events that might potentially help tutors from across the UK who tutor on specific modules.  One thing that keeps coming to mind is that there is also more potential in terms of how the student support team members can feed directly into module teams.  I can’t help but feel that these new structures can help different bits of the university to work closer together, and that can only be a good thing.

All this said, it’s still very early days for the student support team and I guess I’m one of many who are still finding their feet in terms of what we do, how we work, and how we communicate with each other.  It also strikes me that although the university has been reorganised along slightly different lines, the notion of a geographic region or ‘place’ remains very important.  The development sessions that are run within our regional centres for our associate lecturers are invaluable.  Developments in information and communication technology have made the SST possible, which means it’s now up to us to figure out how best to take advantage of the opportunities it offers us.

Christopher Douce

TM354 Software Engineering: briefing

Edited by Christopher Douce, Monday, 11 Sept 2023, 16:27

On Saturday 27 September I went to a briefing for a new OU module, TM354 Software Engineering.   I have to secretly confess that I was quite looking forward to this event for a number of reasons: I haven’t studied software engineering with the OU (which meant that I was curious), I have good memories of my software engineering classes from my undergraduate days and I also used to do what was loosely called software engineering when I had a job in industry.  A big question that I had was: ‘to what extent is it different to the stuff that I studied as an undergrad?’  The answer was: ‘quite a bit was different, but then again, there was quite a bit that was the same too’.

I remember my old undergrad lecturer introducing software engineering by saying something like, ‘this module covers all the important computer stuff that isn’t in any of the other modules’.   It seemed like an incredibly simple description (and one that is also a bit controversial), but it is one that has stuck in my mind.  In my mind, software engineering is a whole lot more than just being other stuff.

This blog post summarises the event and is mostly intended for the tutors who came along on the day, but I hope it might be useful for anyone else who might be interested in either studying or tutoring the module.  There’s information about the module structure, something about the software that we use, and also something about the scheduling of the tutorials.

Module structure

TM354 has three blocks, which are also printed books.  These are: Block 1 – from domain to requirements, Block 2 – from analysis to design, and Block 3 – from architecture to product.  An important aspect to the module is a set of case studies.  The module is also supported by a module website and, interestingly, a software tool called ShareSpace that enables students to share different sketches or designs.  (This is a version of a tool that has been used in other modules such as U101, the undergraduate design module, and T174, an introduction to engineering module).

Block 1: from domain to requirements

Each block contains a bunch of units.  The first unit is entitled ‘approaches to software development’, which, I believe, draws a distinction between plan driven software development and agile software development.  I’ve also noted down the phrase ‘modelling with software engineering’.  It’s great to see agile mentioned in this block, as well as modelling.  When I worked in industry as a developer, we used bits of both.

The second unit is called requirements concepts.  This covers functional requirements, non-functional (I’m guessing this is things like ‘compatibility with existing systems’ and ‘maintainability’ – but I could be wrong, since I’ve not been through the module materials yet), testing, and what and how to document.  Another note I’ve made is: ‘perspectives on agile documentation’.

Unit three is from domain modelling to requirements.  Apparently this is all about business rules and processes, and capturing requirements with use cases.  Prototyping is also mentioned.  (These are both terms that would be familiar to students who have taken the M364 Interaction Design module).  Unit four is all about the case study (which, I have to confess, I don’t know anything about!)

Block 2: from analysis to design

Unit five is about structural modelling of the domain versus the solution.  Unit six is about dynamic modelling, which includes design by contract.  Unfortunately, my notes were getting a bit weak at this point, but I seem to remember thinking, ‘ahh… I wonder if this relates to the way that I used to put assertions in my code when I was a coder’.  This introduction was piquing my interest.
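
To illustrate the connection (this is a toy example of my own, not something taken from the module materials), design by contract states preconditions and postconditions for an operation, and assertions are one simple way of checking them at run time:

def withdraw(balance, amount):
    """Withdraw money from an account, written in a design-by-contract style.

    Precondition:  amount is positive and no greater than the balance.
    Postcondition: the new balance is the old balance minus the amount, and never negative.
    """
    assert amount > 0, "precondition: amount must be positive"
    assert amount <= balance, "precondition: cannot withdraw more than the balance"
    new_balance = balance - amount
    assert new_balance >= 0, "postcondition: the balance can never go negative"
    return new_balance

print(withdraw(100.0, 30.0))  # prints 70.0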

Unit seven was entitled ‘more dynamic modelling’, specifically covering states and activities, and capturing complex interactions.  Apparently the black art of ‘state machines’ is also covered in this bit.  (In my undergrad days, state machines were only covered in the completely baffling programming languages course.)  Unit eight then moves onto the second part of the case study, which might contain domain modelling, analysis and design.
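
For anyone who also found state machines a bit of a black art, they are really just a table of states, events and transitions.  A tiny sketch of my own (the classic ticket turnstile, not anything from the module) looks like this:

# States and events for a ticket turnstile: a classic state machine example.
TRANSITIONS = {
    ("locked", "insert_coin"): "unlocked",
    ("locked", "push"): "locked",          # pushing a locked turnstile does nothing
    ("unlocked", "push"): "locked",        # passing through locks it again
    ("unlocked", "insert_coin"): "unlocked",
}

def run(events, state="locked"):
    """Step through a sequence of events; an undefined transition raises a KeyError."""
    for event in events:
        state = TRANSITIONS[(state, event)]
        print(f"{event:>11} -> {state}")
    return state

run(["insert_coin", "push", "push", "insert_coin"])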

Block 3: from architecture to product

This block jumped out at me as being the most interesting (but this reflects my own interests).  Unit nine was about ‘architecture, patterns and reuse’.  Architecture and requirements, I’ve noted, ‘go hand in hand’.  In this section there’s something about architectural views and reuse in the small and reuse in the large.  During the briefing there was a discussion about architectural styles, frameworks and software design patterns.

When I was an undergrad, software patterns hadn’t been discovered yet.  It’s great to see them in this module, since they are a really important subject.  I used to tell people that patterns are like sets of abstractions that allow people to talk about software.  I think everyone who is a serious software developer should know something about patterns.
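
By way of a flavour of what a pattern gives you (again, a sketch of my own rather than anything from TM354), here is the Observer pattern in a few lines of Python: a subject keeps a list of interested parties and tells them all when something happens, without knowing anything about what they do with the news.

class Subject:
    """Minimal Observer pattern: observers register with a subject and are notified of events."""

    def __init__(self):
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def notify(self, event):
        for observer in self._observers:
            observer(event)

build_status = Subject()
build_status.attach(lambda event: print(f"email alert: build {event}"))
build_status.attach(lambda event: print(f"dashboard:   build {event}"))
build_status.notify("passed")   # both observers react to the same event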

Unit ten seems to take a wider perspective, talking about ‘building blocks and enterprise architectures’.  Other topics include component based development, services and service oriented architectures (which is a topic that is touched upon in another module, and also potentially the forthcoming TM352 module that covers cloud computing).

Unit eleven is about quality, verification, metrics and testing.  My undergrad module contained loads of material on metrics and reliability, and testing was covered only in a fairly theoretical way, but I understand that test-driven development is covered in this module (which is a topic that is linked to agile methods).  I’ll be interested to look at the metrics bit when this bit of the module is finalised.
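
I don’t know how the module itself presents test-driven development, but the basic rhythm is easy to show: write a failing test first, then write just enough code to make it pass.  The toy example below is my own (it also sneaks in McCabe’s cyclomatic complexity, one of the classic metrics) and uses Python’s unittest module.

import unittest

def cyclomatic_complexity(decision_points):
    """McCabe's metric for a single-entry, single-exit routine: decision points plus one."""
    return decision_points + 1

class TestCyclomaticComplexity(unittest.TestCase):
    # In test-driven development these tests would be written first,
    # and would fail until the function above is implemented.
    def test_straight_line_code_has_complexity_one(self):
        self.assertEqual(cyclomatic_complexity(0), 1)

    def test_each_decision_adds_one(self):
        self.assertEqual(cyclomatic_complexity(3), 4)

if __name__ == "__main__":
    unittest.main()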

The final unit takes us back to the case study.  Apparently we look at architectural views and patterns.  Apparently there is also a set of further topics that are looked at.  I’m guessing that students might well have to go digging for papers in the OU’s huge on-line library.

Software

I’ve mentioned ShareSpace, which is all about sharing of software models with other students (modelling is an important skill), to enable students to gain experience of group work and to see what other students are doing and creating: software development invariably happens in teams.  Another important bit of software is an open source IDE (integrated development environment) called NetBeans.  I’m not sure how NetBeans is going to be used in this module, but it is used across a number of different OU modules, so it should be familiar to some TM354 students.

Assessment

TM354 comprises three tutor marked assignments, a formative quiz at the end of every unit (that students are strongly encouraged to complete), and an end of module exam.  The exam comprises two parts: a part that has questions about concepts, and a second bit that contains longer questions (I can’t say any more than this, since I don’t know what the exam looks like!)

Tutorials

Each tutor is required to deliver two hours of face to face tuition, and eight hours of on-line sessions through OU Live (as far as I understand).  In the London region, we have three tutors, so we’re having all the groups come to the same events, with each tutor delivering a face to face session to support students through every block and every TMA.

We’re also planning on explicitly scheduling six hours of OU Live time, leaving two hours that the tutor can use at his or her discretion throughout the module (so, if there are a group of students who struggle with concepts such as metrics, design by contract, or patterns, a couple of short ad-hoc sessions can be scheduled). 

All the OU Live sessions will be presented through a regional OU Live room.  This means that students in one tutor group can visit a session that is delivered by another London tutor.  The benefit of explicitly scheduling these sessions in advance is that all these events are presented within the student’s module calendar (so they can’t say that they didn’t know about them!)  All these plans are roughly in line with the new tuition strategy policy that is coming from the higher levels of the university.  A final thought regarding the on-line sessions is that it is recommended that tutors record them, so students can listen to the events (and potentially go through subjects that they find difficult) after an event has taken place.

A final note that I’ve made in my notebook is ‘tutorial resources sharing (thread to share)’.  This is connected to a tutor’s forum that all TM354 tutors should have access to.  I think there should be a thread somewhere that is all about the sharing of both on-line and off-line (face to face) tutorial resources.

Christopher Douce

Teaching and learning programming for mobile and tablet devices: London Metropolitan University

Visible to anyone in the world

On 24 July 2014, I went to a Higher Education Academy sponsored event at London Metropolitan University.  The event was all about programming mobile devices, and it was the third time I had been to this event.  The previous time I went along, I spoke about a new module: TT284 Web Technologies (OU website).  This time I had two purposes: to share something about the beginnings of a new module TM352 Web, Mobile and Cloud (or, more specifically, its main objectives) and to learn what other institutions are getting up to.

A case study…

The first presentation of the day was by Yanguo Jing from London Met (who organised the event) and Alastair Craig.  They presented ‘a case study of the delivery of a year 12 summer school on mobile app development’ (I had to ask what ‘year 12’ meant: it means 16 or 17 year olds…): this was part of an outreach event that London Met run (where students were selected at random to participate).

They described some of the challenges that they faced.  Firstly, the students who joined the summer school sometimes had no programming knowledge, and they had to make the summer school fun.  A really big challenge was to try to scaffold the learning so that the students could create something presentable by the end of the week.

At this HEA event last year, a new programming system called TouchDevelop was introduced.  TouchDevelop is a ‘touch friendly’ programming language from Microsoft Research.  (You can check out the kind of apps that have been created by visiting the apps section of the TouchDevelop site).

The language features a touch screen programming interface that is especially designed to work with mobile devices; it presents users with only those programming constructs that can be selected at a given point (it is also graphical, in the same sense that Scratch is).  One really interesting aspect of the system is that you don’t have to install anything.  TouchDevelop also creates HTML 5 code, which means that it can be run on a wide range of different devices.

The summer school lasts for a week.  It begins with an introduction to the tool and a discussion of syntax.  The next two days are all about the basics of a game and the game engine.  On the fourth day the students are asked to create their own game, and on the fifth day they are asked to present their games to each other.  Masters level students acted as supervisors.  One observation was that some students (who had some prior programming experience, invariably using Scratch) got ahead with everything.

A fundamental question is, ‘how do you teach people in 18 hours when you don’t know what they know?’  The trick, apparently, is to get them to do things.

Some discussion questions were: ‘is it a good idea [to run this kind of summer school]?’, ‘does your department do something similar?’, and ‘how might you scale up this type of outreach activity?’

One thing that I learnt from the discussion is that there is a new version of Scratch available.  This first presentation ended with a discussion about MOOCs, and the point was made that MOOCs are very different to outreach.

Considering the cloud: teaching mobile, cloud computing and the web

The second presentation of the day was by yours truly.  The aim of the presentation was to talk about some of the areas that a new module about cloud computing may (or may not) cover.  Towards the end of the presentation, I asked all the delegates the following questions:

  • What do you think needs to be taught (cloud, mobile, web?)
  • How might you teach these concepts?
  • What might the challenges be?
  • How might you carry out assessments?
  • How do we protect and inform about change?

As everyone discussed these questions, I made a few notes.  One of the fundamental challenges (with an OU course) is to choose technologies that are not going to age quickly.  ‘The cloud’ is a really fast moving area where there appears to be continual change and innovation; new software services and releases are coming out all of the time.  One way to counter this is to teach the underlying concepts and not just information about the services.

Another approach is to perhaps concentrate on building a learning community.  Developers and technical specialists invariably live within a community that shares technical knowledge and expertise.  It might be interesting and useful to expose learners to the dynamics of these environments.

An interesting point was that both mobile and web platforms are just different ways to consume resources.  Increasingly the ‘web’ is being equated to HTML 5, and HTML 5 is increasingly being embedded within mobile devices.

On the subject of teaching, one delegate made a really interesting and relevant point.  He said, ‘I’ve given up lecturing… half of them just turn off’.  When it comes to teaching the development of mobile apps the thing to do is to split students into small groups; it is the learning by doing that really counts.

When it comes to assessment, one delegate said, ‘you’ve got to have a project – if you can’t develop an app, then you fail’, and it’s important to get continual updates on progress.  Other approaches might include the use of computer marked multiple-choice questions, and writing about the bigger reflections and lessons from the module.

Poster session

By way of a brief interlude, Yanguo introduced a series of posters that had been put on the wall of the meeting room.  The posters were all about different apps that students had created.  There were two indoor navigation apps, an app for parking (which made me remember one of my blog-rants about poor interaction design), some kind of ‘cash register’ virtual payment app, a food checker or testing app, and a museum guide app.

Bringing the cloud into the classroom

The third presentation of the day was by Paul Boocock, from Staffordshire University.  Paul mentioned that undergrad students are introduced to a range of different platforms: iOS, Android and Windows (if I’ve understood things correctly).  For postgraduate students, there are a number of interesting sounding modules, such as Android app development and Advanced location aware app development.  These link into different mobile technology postgraduate qualifications (Staffs University), such as their Mobile Device Application Development MSc, Postgraduate Certificate (PgCert) and their Postgraduate Diploma (PgDip).

One of the big recent changes to their curriculum is that Staffs is now including ‘the cloud’ into the different mobile modules.  One thing that I should mention is that the concept of ‘the cloud’ is understood in terms of public clouds (as opposed to private clouds that are hosted by the university).

Paul treated us to some pictures of data centres, and said ‘[the cloud] is changing how we teach this stuff’.  He left us with an interesting idea: ‘what used to take 30 days to get up and running can now be achieved in 30 minutes’.  The point was simple: you no longer need to buy, configure and commission servers.  The benefits of ‘the cloud’ include potentially lower costs, scaling and the potential of gaining global reach.  In some respects, though, it might become more difficult to be directly exposed to the physical hardware that runs systems.

We were introduced to a term that was unfamiliar to me: cloud computing patterns.  The term relates to the way that cloud systems are consumed, as opposed to how they are designed.  Some patterns include on/off usage, where an application experiences high levels of demand for a while (a bit like batch jobs); rapid growth, where a product or system takes off very quickly (so demand keeps increasing); and predictable or unpredictable bursts of traffic (within computer games, for example).
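To make these consumption patterns a little more concrete, here is a minimal (and entirely hypothetical) Python sketch of the kind of threshold rule an autoscaler might apply to cope with bursty demand.  The function, its name and the thresholds are my own invention for illustration, not anything that Paul presented.

    import statistics

    def desired_instances(current, recent_cpu_percentages,
                          scale_up_at=70, scale_down_at=25,
                          minimum=1, maximum=20):
        """Decide how many instances to run, given recent average CPU load."""
        average_load = statistics.mean(recent_cpu_percentages)
        if average_load > scale_up_at and current < maximum:
            return current + 1   # a burst of demand: add an instance
        if average_load < scale_down_at and current > minimum:
            return current - 1   # demand has dropped away: remove one
        return current           # otherwise, leave things as they are

    # A sudden burst pushes average CPU to around 85 per cent, so a
    # cluster of three instances would grow to four on the next check.
    print(desired_instances(3, [80, 85, 90]))   # prints 4

Real cloud platforms do something far more sophisticated than this, of course, but the principle of reacting to measured demand rather than provisioning for the worst case is the same.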

Paul also talked about different platforms.  He mentioned a good number that I had heard of (but I’m not intimately familiar with).  These were Amazon (of course), Microsoft, Rackspace, HP Public cloud, and Google Cloud.  Given that his focus was on public clouds for teaching purposes, he discounted HP and Rackspace (I think due to cost), and then considered Amazon.

Amazon apparently offer something called educational grants (Amazon website), which allow educators to gain free credits so that computing students can use their services.  The trade-off is that students who use the Amazon systems will be able to take their skills directly into the workplace.  Apparently, you can tell them how many students you have, and then they sort out the number of licences (or credits).

We learnt that Microsoft (of course) run a similar scheme, which enables students to use Azure academic passes (Microsoft Azure website).  Google was not considered as an alternative since there are no current discounts for non-profit organisations.  In the case of Staffordshire, Paul opted for Microsoft mainly because they had already made an investment in Microsoft tools and environments.

Before a live coding demo, which featured a pre-built service (from what I’ve noted), we were given a brief description of the different Azure components (or Azure services).  These were: compute, app services, data services, and network (this reminded me that I’ve come across similar terms when looking at the open source equivalent called OpenStack).

At the end of Paul’s session there was a lot of time for discussion.

Points of discussion included the challenge of working with different SDKs, and the emphasis on design patterns.  On the masters course, students were asked to create an interactive chat app that wasn’t too dissimilar to the hugely popular WhatsApp.

Of course, there are always challenges that educators need to be mindful of.  These include the need to change modules without increasing their difficulty, and the question of how to assess students’ work if everything exists in the cloud (and students create services using lots of template code).  One way to do this is, of course, to ask students to write a reflective report about what they did, to get a sense of what they understand.

All in all, it was both really interesting and really useful to know how another institution had successfully tackled the introduction of programming the cloud into their computing curriculum.

Developing digital literacies

The fourth talk of the day was by Terry McAndrew, which had the subtitle, ‘how students can quickly create interactive media resources for your curriculum’. Terry spoke about the broad subject of ‘digital literacy’, which can be defined as ‘the ability to effectively engage with a range of digital technologies to create, navigate and manipulate information’.  Terry mentioned a resource known as a JISC Digital Literacy InfoKit (JISC website).   The kit contains seven different areas, which are: information literacy, media literacy, communication and collaboration, career and identity management (which I understand to be a new bit), ICT literacy, learning skills and digital scholarship.  A two year digital literacy programme (JISC) was also mentioned.

Interestingly, Yanguo mentioned some digital literacy resources that were available from London Met.  There’s also another bunch of digital literacy resources from the University of Southampton.  All these different resources made me realise that perhaps this is an area that I really need to catch up on.

Another part of Terry’s presentation centred upon accessibility.  Terry mentioned a tool called Xerte (University of Nottingham) which can be used to create accessible digital material that can be delivered through a virtual learning environment to different devices.  It’s a tool that is sometimes used by students who are studying a module that I tutor, H810 Accessible online learning: supporting disabled students (OU website).  The content that is delivered is presented using HTML 5, but the editor uses Adobe Flash (we were, however, told that there are plans afoot to develop an HTML 5 based editing environment).

Two other interesting links (and projects) were mentioned.  The first was JORUM, a repository of digital educational material that can be shared between different institutions.  JORUM has been going for quite a while, and I hadn’t heard it mentioned for quite some time.  Having a quick look at the JORUM site quickly tells me that it has changed quite a bit since I first looked at it properly (which must have been around six or seven years ago).  The second was a project called ACTOER, which is an abbreviation for Accessibility Challenges and Techniques for Open Educational Resources (of which Terry, who is based at TechDis, is the project manager).

I enjoyed Terry’s talk, and I found his presentation of different digital literacy resources useful, but there was little about the learning and teaching of how to program mobile devices.  This said, accessibility is always really important, and it’s something that designers of curriculum need to always be mindful of: I welcomed Terry’s reminders.

Alignment of mobile learning agenda with learning and teaching strategies in HEIs

The final presentation of the day was by Remy Olasoji from the University of East London.  From what I remember, I understand Remy to be an expert in the field of requirements engineering.  His presentation was about taking lessons from requirements engineering to try to understand how best to make use of mobile technology.

A final question of the day was, ‘how do we drive the mobile agenda forward?’  A simple answer was: ‘mobile is already happening – it’s driving forward of its own accord’.  One challenge lies with figuring out how to teach the fundamentals of mobile technologies to enable students to be thoroughly equipped and prepared when they have to work with new and changing devices.  Another challenge lies with figuring out how to best make use of devices to help students with their studies.

Reflections

All in all, a useful event; it’s always useful to hear what happens within other institutions and to learn about what challenges educators need to overcome.  One area that I would like to have heard more discussion about is information and data security.  The ‘cloud’ exposes these issues quite naturally, along with issues that relate to business and management.

Christopher Douce

New Technology Day - June 2014

Visible to anyone in the world
Edited by Christopher Douce, Tuesday, 8 Oct 2019, 17:42

This post is a quick summary of a New Technology Day event that took place at The Open University London regional centre on Saturday 14 June 2014.  I’ve written this post for a number of reasons: for my esteemed colleagues who came along to the day, so that I help to remember what happened on the day, so that I can share with my bosses what I’m getting up to on a day to day basis, and for anyone else who might be remotely interested.

One of the challenges that accompanies working in the area of technology, particularly information technology and computing, is that the pace of change is pretty relentless.  There are always new innovations, developments and applications.  There are always new businesses doing new things, and using new techniques.  Keeping up with ‘the new stuff’ is a tough business.  If we spent all our time looking at what was ‘new’ out there, we simply wouldn’t get any work done – but we do need to understand ‘the new’, so we can teach and relate to others who are using ‘all this new stuff’.

The idea for this day came from a really simple idea.  It was to ask colleagues the question, ‘have you heard of any new technology stuff recently?  If so, can you tell me about it?’  Rather than having a hard and fast ‘training’ agenda the idea was to create a space (perhaps a bit like an informal seminar) to allow us to have an opportunity to share views and chat, and to learn from each other.

Cloud computing

After a brief introduction, I kicked off with the first presentation, which was all about cloud computing.  A couple of weeks back, I went to a conference that was all about an open source ‘cloud operating system’ called OpenStack, as part of some work I was doing for a module team.  The key points from the presentation are described in a series of two blog posts (OU Blog).

Towards the end of the presentation, I mentioned a new term called Fog Computing.  This is where ‘the cloud’ is moved to the location where the data is consumed.  This is particularly useful in instances where fast access times are required.  It was interesting to hear that some companies might also be doing something similar.  One example might be companies that deliver pay-on-demand streaming video.  It obviously doesn’t make a lot of sense if the movies that you want to see are located on another continent; your viewing experience may well be compromised by unforeseen network problems and changes in traffic.

It was useful to present this since it helped to clarify some of my understandings, and I also hoped that others found it interesting too.  Whilst the concept of a ‘cloud’ isn’t new (I remember people talking about the magic of an X.25 cloud), the technologies that realise it are pretty new.   I also shared a new term that I had forgotten I had written on one of my slides: the concept of a devop – someone who is also a developer and an operator.

JuxtaLearn project

The second presentation was about the JuxtaLearn project, by Liz Hartnett, who was unable to attend.  Liz, however, was still able to make an impact on the event, since she had gone the extra mile to make an MP3 recording of her entire presentation.  Her talk adopted the PechaKucha format.  This is where a presenter uses 20 slides which change every 20 seconds.  Since her slide deck was set up to change automatically, it worked really well.

We learnt about the idea of a threshold concept (which can be connected to the concept of computer programming) and saw how videos could be made with small project groups.  I (personally) connected this with activities that are performed on two different modules: TU100 My Digital Life, and T215 Communication and Information Technologies, which both ask students to make a presentation (or animation).

OU Live and pedagogy

The next talk of the day was by Mandy Honeyman, who also adopted the PechaKucha format.  Mandy talked about a perennial topic, which is the connection between OU Live and pedagogy.  I find this topic really interesting (for the main reason that I haven’t yet ‘nailed’ my OU Live practice within this format, but it’s something that I’m continuing to work on).  I can’t speak for other people, but it has taken me quite a bit of time to feel comfortable ‘teaching’ using OU Live, and I’m always interested in learning further tips.

Mandy has taken the time and trouble to make a version of her presentation available on YouTube.  So, if you’ve got the time (and you were not at the event), do give it a look.  (She prepared it using PowerPoint, and recorded it using her mobile phone).

The biggest tip that I’ve made a note of is the importance of ‘keeping yourself out of it’, or ‘taking yourself out of it [the OU Live session]’.  When confronted by silence it’s easy to feel compelled to fill it with our own chatter, especially in situations where students are choosing not to use the audio channel.

One really interesting point that came out during the discussion was how important it is to show students how to use OU Live right at the start of their journey with the OU.  I don’t think this is done as well as it could be at the moment.  I feel that level 1 tutors are implicitly given the challenging task of getting students up to speed with OU Live, but they already have a lot on their hands in terms of the academic side of things.  I can’t help thinking that we could be doing a bit better when it comes to helping students become familiar with what is increasingly becoming a really important part of OU teaching and learning.

It was also mentioned that application sharing can run quite slowly (especially if you do lots of scrolling) – and one related thought is that this might well impact on the teaching and learning of programming.

A final point that I’ll add is that OU Live can be used in a variety of different ways.  One way is to use it to record a mini-lecture, which students can watch in their own time.  After they’ve seen it, they can then attend a non-recorded discussion seminar.  I’ve also heard of it being used to facilitate ‘drop in sessions’ over a period of a couple of hours (which I’ve heard is an approach that can work really well).

Two personal reflections that connect to this session include: we always need good clear guidance from the module team about how they expect tutors to use OU Live, and secondly, we should always remember to give tutors permission to use the tool in the ways that make the best use of their skills and abilities, i.e. to say, ‘it’s okay to go ahead and try stuff; this is the only way you can develop your skills’.

The March of the MOOCs

Rodney Buckland, a self-confessed MOOCaholic, gave the final presentation of the morning.  The term MOOC is an abbreviation for Massive Open Online Course.  From the sound of it, Rodney has taken loads.  (Did he really say ‘forty’?  I think he probably did!)

He mentioned some of the most popular platforms.  These include: Coursera, Udacity and FutureLearn (which is a collaboration between the OU and other universities).  Rodney also mentioned a swathe of less well known MOOC platforms, such as NovoEd.   A really interesting link that Rodney mentioned was a site called MOOCList which is described as ‘an aggregator (directory) of Massive Open Online Courses (MOOCs) from different providers’. 

Rodney spoke about his experience of taking a module entitled, ‘Science of the solar system’.  He said that the lecturer had really pushed his students. ‘This was a real surprise to me; this was a real third level physics module’.

A really important point was that MOOCs represented an area that was moving phenomenally quickly.  After his talk had finished there was quite a lot of discussion about a wide range of issues, ranging from the completion rates (which were very low), to the people who studied MOOCs (a good number of them already had degrees), and to the extent to which they can complement university study.  It was certainly thought provoking stuff.

Assistive technology for the visually impaired: past, present and future

The first presentation after lunch was by my colleague Richard Walker.  Richard is a visually impaired tutor who has worked with visually impaired students.  He made the really important point that if an associate lecturer works for an average of about ten years, there is a very significant chance that they will encounter a student who has a visual impairment.  Drawing on his previous presentation, he stressed that it is fundamentally important to be aware of some of the challenges that visually impaired students can face.

Richard recently interviewed a student who has a visual impairment by email.  Being a persuasive chap, Richard asked me to help out: I read out the role of his student from an interview transcript.  The point (to me) was very clear: students can be faced with a whole range of different issues that we may not be aware of, and everything can take quite a lot longer.

Another part of Richard’s presentation (which connects the present and the future) was all about mobile apps.  We were introduced to the colour recogniser app, and another app called EyeMusic (iTunes) which converts a scene to sound.   Another really interesting idea is the concept of the Finger Reader from the Fluid Interfaces group at MIT.

A really enjoyable part of Richard’s session was when he encouraged everyone to explore the accessibility settings of their smartphones.  Whilst it was easy to turn the accessibility settings on (so your iPhone spoke to you), it proved to be a lot more difficult to turn the settings off.  For a few minutes, our meeting room was filled with a cacophony of robotic voices that proved to be difficult to silence.

Towards utopia or back to 1984

The penultimate session of the day was facilitated by Jonathan Jewell. Jonathan’s session had a more philosophical tone to it.  I’ve made a note of an opening question which was ‘how right or wrong were we when predicting the future?’

Jonathan referenced the works of Orwell, Thomas More (Wikipedia) and a vision of a dystopian future depicted in THX 1138, George Lucas’s first film.  Other subjects included economic geography (a term that I hadn’t heard before), and the question of whether Moore’s Law (that the number of transistors in a microprocessor doubles every two years) would continue.  On this subject, I have sometimes wondered about what the effect of software design may be if and when Moore’s law fails to continue to hold.

Other interesting points included the concept of the technological singularity and a connection to a recent news item (BBC) where a computer was claimed to have passed the Turing test.

A great phrase was infobesity – that we’re all overloaded with too much information.  This connects to a related phrase that I have heard before, which is the ‘attention economy’.  Jonathan made a similar point that information is not so much a scarce resource.  Instead, we’re limited in terms of what information we can attend to.

We were also given some interesting thoughts which point towards the future.  Everything seems to have become an app: computing is now undeniably mobile.  A final thought I’ve noted down is Jonathan’s quote from security expert Bruce Schneier: ‘surveillance is the business model of the internet’.  This links to the theme of Big Data (Wikipedia).  Thought provoking stuff!

Limits of Computing

The final talk of the day was by Paul Piwek.  Paul works as a Senior Lecturer in the Department of Computing and Communications at The Open University.  Paul works on a number of module teams, and has played an important role in the development of a new module: M269 Algorithms, Data Structures and Computability.  It is a course that allows students to learn about some of the important fundamentals of computer science.

Paul’s brief was to talk about new technologies – and chose to explore this by considering the important question of ‘what are the limits of computability?’  This question is really important in computer science, since it connects to the related questions: ‘what exactly can we do with computers?’ and ‘what can they actually be used to calculate?’

Paul linked the title of his talk to the work of Alan Turing, specifically an important paper entitled, ‘On computable numbers’.  Paul then went on to talk about the differences between problems and algorithms, introduced the concept of the Turing Machine and spoke about a technique called proof by contradiction.
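To give a flavour of what proof by contradiction looks like in this setting, here is a rough Python sketch of Turing’s halting problem argument.  This is my own illustration rather than something Paul showed us: if a perfect ‘does this program halt?’ function could be written, we could use it to build a program that defeats it.

    def halts(program, data):
        """Hypothetical decider: would program(data) eventually halt?
        Turing's argument shows that no such total, always-correct
        function can exist, which is why this body is left empty."""
        raise NotImplementedError

    def paradox(program):
        # Do the opposite of whatever the supposed decider predicts.
        if halts(program, program):
            while True:      # predicted to halt, so loop forever
                pass
        return               # predicted to loop forever, so halt at once

    # Asking whether paradox(paradox) halts leads to a contradiction
    # either way round, which is why halts() cannot exist in general.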

Some problems can take a long time to be solved.  When it comes to computing, speed is (obviously) really important.  An interesting question is: how might we go faster?  One thought is to look towards the subject of quantum computing (an area that I know nothing about; the page that I’ve linked to causes a bit of intellectual panic!)

Finally, Paul directed us to a Canadian company called DWave that is performing research into the area.

Reflections

After all the presentations had come to an end we all had a brief opportunity to chat.  Topics included location awareness and security, digital forensics, social media, and the question of equality and access to the internet.  We could have chatted for a whole lot longer than we did.

It was a fun day, and I really would like to run another ‘new technology day’ at some point (I’ve just got to put my thinking hat on regarding the best dates and times).  I felt that there was a great mix of presentations and I personally really liked the mix of talks about technology and education.  It was a great opportunity to learn about new stuff.

By way of additional information, there is also going to be a London regional ‘research day’ for associate lecturers.  This event is going to take place during the day time on Tuesday 9 September 2014.  It will be a cross-faculty, cross-disciplinary event, so it’s likely that there will be a wide range of different sessions.  If you would like some more information about all this, don’t hesitate to get in touch, and I’ll point you towards my colleague Katy who is planning this event.

Christopher Douce

OpenStack conference, June 2014 (part 2 of 2)

Visible to anyone in the world
Edited by Christopher Douce, Saturday, 7 June 2014, 13:21

This blog post is the second of two that summarises an OpenStack conference that I attended on 4 June in London. 

This second half of the conference had two parallel sessions.  Delegates could either go to the stream that was intended for novices (which is what I did), or go to a more technical session. 

I was quite tempted by the technical session, especially by a presentation that was all about what it means to be an OpenStack developer.  One of the key points that I did pick up on was that you need to know the Python language to be an OpenStack developer; Python is the language that is used within the OU’s data structures and algorithms module, M269 Algorithms, data structures and computability.

Introduction to OpenStack

The first session of the afternoon was by Kevin Jackson who works at Rackspace.

Kevin said that OpenStack and Linux are sometimes spoken about in similar terms.  Both can be created from distributions, and both are supported by companies that can offer consultancy support and help to move products forward. ‘OpenStack is like a pile of nuts’, said Kevin, and the nuts represent different components.

So, what are the nuts?  Nova is a compute engine, which hosts a virtual machine running in a hypervisor.  I now understand that a hypervisor can host one or more virtual machines.  You might have a web server and your application code running within this bit of OpenStack.

Neutron is all about networking.  In some respects, Neutron is a virtual network that has been all written in code.  There is more about this in later presentations.  If you have different parts of an OpenStack implementation, Neutron allows the different bits to talk to each other; it pretends to be a physical network.

Swift is an object store, which is something that was spoken about during an earlier presentation.  Despite my earlier description, Swift isn’t really like a traditional file system.  Apparently, it can be ‘rack or cabinet aware’, to take account of the design of your physical data centre.

Cinder is another kind of storage: block storage.  As mentioned earlier, to all intents and purposes, Cinder looks like a ‘drive’ to a virtual machine.  I can imagine a situation where you might have multiple virtual machines accessing the same block storage device.

Ceilometer is a component that was described as telemetry.  This is a block which can apparently say how much bandwidth is being used.  (I don’t know how to describe what ‘bandwidth’ is in this instance – does it relate to the network, the available capacity within a VM, or the whole installation?  This is a distinct gap in my understanding).

Heat is all about orchestration.  Heat monitors ‘the cloud’, or its environment.  Kevin said, ‘if it knows all about your environment and suddenly you have two VMs and not three, it creates a third one’. This orchestration piece was described as a recipe for how your system operates.

All these bits and pieces are controlled by a web interface called Horizon, which I assume makes calls to the APIs of each of these components.  You can use Horizon to look at the components of the network, for example.  I have to confess to being a bit confused about the distinction between JuJu and this standard piece of OpenStack – this is another question that I need to ask myself.
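For the programmers amongst us, all of these components can be driven through code as well as through Horizon.  By way of a hedged sketch (the cloud name, image and flavour are made-up examples, and the exact calls may differ between versions), asking Nova for a new virtual machine using the openstacksdk Python library looks roughly like this:

    import openstack

    # Connect using credentials held in a clouds.yaml entry called 'mycloud'
    conn = openstack.connect(cloud='mycloud')

    # Look up an image, a flavor and a network to build the server from
    image = conn.compute.find_image('ubuntu-server')
    flavor = conn.compute.find_flavor('m1.small')
    network = conn.network.find_network('private')

    # Ask Nova (the compute component) to create and boot the virtual machine
    server = conn.compute.create_server(
        name='demo-server',
        image_id=image.id,
        flavor_id=flavor.id,
        networks=[{'uuid': network.id}],
    )
    conn.compute.wait_for_server(server)
    print(server.status)

Horizon, as I understand it, is essentially doing this sort of thing on your behalf through its web interface.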

At the end of Kevin’s presentation, I made a note of a question from the floor, which was: ‘why should I go open source and not go for a proprietary solution?’  The answer was interesting: you can get locked into a vendor if you choose a proprietary solution.  If you use an open source solution such as OpenStack, you can move your ‘cloud’ between different providers, say, from Rackspace to HP.  With Amazon web services, you’re stuck with using Amazon web services.  In some respects, these arguments echo arguments that are given in favour of Linux and other open source products.  The most compelling arguments are, of course, freedom and choice.

A further question was, ‘how mainstream is this going to go?’  The answer was, ‘there’s many companies around the globe who are using OpenStack as a solution’, but I think it was also said that OpenStack is just one of many different solutions that exist.

OpenStack and Storage made easy at Lush Cosmetics

The second presentation of the day was made by Jim Liddle who works for a cloud consultancy called Storage Made Easy.

Jim presented a case study about his work with Lush Cosmetics.  I’ve made a note of a number of important requirements: the data that is stored in the cloud should be encrypted, and there should be ways to help facilitate auditing and governance (of the cloud). 

It’s interesting that the subject of governance was explicitly addressed in this case study.  The importance of ‘institutional control’ and the need to carry out checks and balances is one of the reasons why organisations might choose private clouds over public clouds. In the case of Lush, two key drivers included the cost per user, and the need to keep their data within the UK.

A new TLA that I heard was OVF (Wikipedia), an abbreviation for Open Virtualization Format, and is a way to package virtual machines in a way that is not tied to any particular hypervisor (VM container), or architecture.  Other technologies and terms that were referred to included: MySQL, which is touched on in TT284 Web Technologies (OU), Apache, MemCached (Wikipedia) and CentOS.

Deploying Windows Workloads into OpenStack using JuJu

A lot of the presentations had a strong Linux flavour to them.  Linux, of course, isn’t the only platform that can be used to power clouds. Alessandro Pilotti from Cloudbase solutions spoke on the connections between Windows and OpenStack.

Terms that cropped up during his presentation included Hyper-V (a hypervisor from Microsoft), KVM (Kernel based virtual machine, which is Linux hypervisor), MaaS (metal as a service, an Ubuntu term), GRE Tunnels (GRE being an abbreviation for Generic Routing Encapsulation), NVGRE (Network Virtualization using Generic Routing Encapsulation), and RDP (Remote Desktop Protocol).

It was all pretty complicated, and even though I’m reasonably technical, this was at a whole other level of detail.  Clicking through some of the above links soon takes me into a world of networking and products that are pretty new to me.  This clearly suggests that there is a whole lot of ‘new technology’ out there that I need to try to make a bit of sense of, and this, of course, takes time.

Alessandro also treated us to a live video link that showed a set of four small computers that were all hooked up together (I have no idea what these small desktop computers without screens were called; they used to have a special name).  The idea was to show LEDs flashing to demonstrate some remote rebooting going on.

This demo didn’t quite work out as planned, but it did get me thinking: to really learn how to do cloud stuff, a good idea would be to spend time actually playing with bits of physical hardware. This way you can understand the relationships between logical and physical architectures.  The challenge, of course, is finding the time to get the kit together, and to do the learning.

Using Swift in Entertaining Ways

This presentation was made by a number of people from Sohonet, a company that offers cloud services to the film and TV industry.  An interesting application of cloud computing is film and video post-production, the part of production where recordings are digitally edited and manipulated. An interesting challenge is that when it comes to video post-production we’re talking about huge quantities of data, and data that needs to be kept safe and secure.

Sohonet operates two clusters that are geographically separate.  Data needs to be held over different timescales, i.e. short, medium and long-term, depending upon the needs of a particular project.

A number of interesting products and companies were mentioned during this talk.  These include Expandrive where an OpenStack Swift component can become a network drive.  Panzura was mentioned in terms of Swift as a network appliance.  Zmanda and Cloudberrylab were all about backup and recovery.  Interesting stuff; again, a lot to take in.

Bridges and Tunnels – a drive through OpenStack networking

Mark McClain from the OpenStack foundation, talked about the networking side of things, specifically, the OpenStack networking component that is called Neutron.  Even though I didn’t understand all of it, I really enjoyed this presentation.  On a technical level, it was very dense; it contained a lot of detail.

Mark spoke about some of the challenges of using the cloud.  These included a high density of servers, the difficulties of scaling and the need for on-demand services.  A way to tackle some of these challenges is to use network virtualisation and something called overlay tunnelling (but I’m not quite sure what that means!)

Not only can virtual machines talk to virtual drives (such as the block storage service, Cinder), but they can also talk to a virtual network.  The design goals of the network component were to have a small core, and to have a pluggable open architecture which is configurable and extensible.  You can have DHCP configuration agents and can specify network traffic rules.  Neutron is also (apparently) backed by a database and a message queue.  (I also heard that there is a REST interface, if I’ve understood it correctly and my notes haven’t been mangled in the rush to write everything down).

A lot of network hardware can now be encoded within software (which links back nicely to the point about abstraction that I mentioned in the first block).  One example is something called Openvswitch (project website).  I’ve also noted down that you can have a load balancer as a service, a VPN as a service and a firewall as a service (as highlighted by the earlier vArmour talk).
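As a small illustration of the ‘network as code’ idea, here is a hedged sketch of creating a network and a subnet through the openstacksdk Python library; the names and addresses are my own examples, and the talk itself didn’t show any code.

    import openstack

    conn = openstack.connect(cloud='mycloud')   # assumes a clouds.yaml entry

    # Create a virtual network, then a subnet within it; Neutron's DHCP
    # agent can hand out addresses from this range to new virtual machines.
    net = conn.network.create_network(name='demo-net')
    subnet = conn.network.create_subnet(
        network_id=net.id,
        name='demo-subnet',
        ip_version='4',
        cidr='192.168.10.0/24',
    )
    print(net.id, subnet.cidr)

The point is that what used to be switches, cables and patch panels is now a few objects created through an API.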

Hybrid cloud workloads

The final talk of the day was by Monty Taylor who is also from the OpenStack foundation.  A hybrid cloud is a cloud that is a combination of public and private clouds (which could, arguably be termed an ‘ecosystem of clouds’).  Since it was the end of the day, my brain was full, and I was unable to take a lot more on board.

Reflections

All this was pretty interesting and overwhelming stuff.  I remember one delegate saying, ‘this is all very good, but it’s all those stupid names that confuse me’.  I certainly understand where he was coming from, but when it comes to talking about technical stuff, the names are pretty important: they allow developers to share understandings.  I’m thankful for those names, although each name does take quite a bit of remembering.

One of the first things I did after the conference was to go and look on YouTube.  I thought, ‘there’s got to be some videos that help me to get a bit more of an understanding of everything’, and I wasn’t to be disappointed – there are loads.  Moving forward, I need to find some time to look through some of these.

One of the things that I’ll be looking for (and something that I would have liked to see in the conference) was a little bit more detail about case studies that explicitly show how parts of the architecture work.  We were told that virtual machines can spin up in situations where we need to attend to more demand, but perhaps the detail of the case studies or explanations passed me by.

This is a really important point.  Some aspects of software development are changing.  I’ve always held the view that good software developers need to have an appreciation of system administration (or the ‘operations’ side of things).  When I had a job in industry there was always a separation between the systems administrators and the developers.  When the developers were done, they threw software over the wall to the admins, who deployed it.

This conference introduced me to a new term: a devop – part developer, part operator.  Devops need to know systems stuff and programming stuff.  This is a reflection of software being used at new levels of abstraction: we now have concepts such as network as a service, and software defined security.  Cloud developers (and those who are responsible for keeping clouds running) are systems software developers, but they can also be application developers, and they have to understand application development too. 

A devop needs a very wide skill set: they need to know about networks, hardware, operating systems, and different types of data store.  They might also need to know about a range of different scripting languages, and other languages such as Python.  All these skills take time (and effort) to acquire.  A devop is a tough and challenging job, not only due to the inherent complexity of different components, but also due to the speed that technologies change and evolve.

When I arrived at the conference, I knew next to nothing about what OpenStack was all about, and who was using it.  By the end of the conference I ended up knowing the names of some of its really important components; mists of confusion had started to lift.  There is, however, a huge amount of detail to get my head around, and one of the things that I’m also going to do is to look at some user stories (OpenStack foundation).  This, I think, will help to consolidate some of my learning.

Christopher Douce

OpenStack conference, June 2014 (part 1 of 2)

Visible to anyone in the world
Edited by Christopher Douce, Friday, 6 June 2014, 17:43

On 4 June, I went to an event that was all about something called OpenStack.  OpenStack is an open source software framework that is used to create cloud computing systems.  The main purpose of this blog post is to share my notes with some of my colleagues, but also with some of the people who I met during the conference.  Plus, it might well be of interest to others too.

Cloud computing is, as far as I understand it, a broad term that relates to the consumption and use of computing resources over a network.  There are a couple of different types of cloud: there are public clouds (which are run by large companies such as Amazon and Google), private clouds (which are run by a single organisation), and hybrid clouds (which are a combination of public and private clouds).  There’s also the concept of a community cloud - this is where different organisations come together and share a cloud, or resources that are delivered through a cloud.

This is all very well, but what kind of computing resources are we talking about?  As far as I know, there are a few.  There’s software as a service (or SaaS).  There’s PaaS, meaning Platform as a Service, and there’s IaaS, which is Infrastructure as a Service.  Software as a Service is where you offer software through a web page, and you don’t ever touch the application code underneath.  Infrastructure as a Service is where you might be able to manage a series of ‘computers’ or servers remotely through the cloud.  More often than not, these computers are running in something called virtual machines.

These concepts were pretty much prerequisites for understanding what on earth everyone was talking about during the day.  I also picked up on a whole bunch of terms that were new to me, and I’ll mention these as I go.

Opening Keynote : The OpenStack Foundation

Mark Collier opened the conference.  Mark works for the OpenStack Foundation (OpenStack website). During his keynote he introduced us to some of the parts that make up OpenStack (a storage part, a compute part and a networking part), and said that there is a new software release every six months.  To date there are in the order of 1.2k developers.  The community was said to comprise approximately 350 companies (such as RedHat, IBM, HP, RackSpace) and 16k individual members.

Mark asked the question: ‘what are we trying to solve?’  He then went on to quote Marc Andreessen, who said, ‘software is eating the world’.  Software, he said, is transforming the economy and disrupting industries. 

One of the most important tools in computer science is abstraction.  OpenStack represents a way to create a software defined data centre (a whole new level of abstraction), which allows you to engineer flexibility to enable organisations to move faster and software systems to scale more quickly.

Mark mentioned a range of different companies who are using OpenStack.  These could be considered to be superusers (and there’s a corresponding superuser page on the OpenStack website which presents a range of different case studies).  Superusers include organisations such as Sony, Disney and Bloomberg, for example.

I remember that Mark said that OpenStack is a combination of open source software and cloud computing.  Another link that I noted down was to something called the OpenStack marketplace (OpenStack website).  Looking on this website shows a whole range of different Cloud distributions (many of which come from companies that offer Linux distributions).

Keynote: Canonical, Ubuntu and OpenStack

Mark Shuttleworth from Canonical (Canonical website) offered an industry perspective.  Canonical develops and supports Ubuntu, which is a widely used Linux distribution.  (It is used, as far as I can remember, in the TM129 Technologies in Practice module).  As well as running on the desktop, Ubuntu is widely used on the server side, running within data centres.  A statistic I’ve noted down is that Ubuntu accounts for ‘70% of guest workloads’.  What this means is that we’re talking about instances of the Linux operating system that have been configured and packaged by Ubuntu (that are running on a server within a datacentre, somewhere).

A competitor to Ubuntu is another Linux distribution called CentOS.  There is, of course, also Microsoft Windows Server.  When you use public cloud networks, such as those provided by Amazon, I understand that you’re offered a choice of the operating system that you want to ‘host’ or run.

An interesting quote is, ‘building your cloud is a bit like building your own mainframe – users will always want it to be working’.  We also heard of something called OpenStack Interoperability Laboratory.  Clouds can be built hundreds of times a day, we were told – with different combinations of technology from different vendors.  ‘Iteration is the only way to understand the optimal architecture for your use case’.

A really important aspect of cloud computing is the way that a configuration can dynamically adapt to changing circumstances (and user demands).  The term for how this is achieved (in the cloud computing world) seems to be ‘orchestration’.  In OpenStack, there is a tool called JuJu (Wikipedia).  JuJu enables (through a dashboard interface) different combinations of components to be defined.  There is a concept of a ‘charm’ (which was described as scripts which contain some operational coding).  If you would like to look at what it is all about, there’s a website called JuJu Charms that I’ve yet to spend time exploring.

I’ve also noted down something called a Service Orchestration Framework, which lets you place services where you want, and on whatever servers you want.  There are some reference installations for certain types of cloud installations (which reminds me of the idea of ‘design patterns’ in software).

Mark referred to a range of different technologies during his talk, some of which I had only very briefly heard of.  One technology that was referred to time and time again was the concept of the hypervisor (Wikipedia).  I understand this to be a container (either hardware or software) that runs one or more virtual machines.  Other terms that he mentioned or introduced include KVM (Kernel-based virtual machine), Ceph (a way to offer shared storage), and MaaS, or Metal as a Service (Ubuntu), which ‘brings the language of the cloud to physical servers’.

A further bunch of mind boggling technical terms were mentioned, including ‘lightweight hypervisors’ such as LXC (LinuX Containers), Hadoop, which is a data storage framework, and TOSCA (Wikipedia), which is an abbreviation for Topology and Orchestration Specification for Cloud Applications.  In terms of databases, some new (and NoSQL) technologies that were mentioned included MongoDB and Cassandra.

At this point, it struck me how much technologies have changed in such an incredibly short time, reminding me that we live in interesting times.

Keynote: Agile infrastructure built in OpenStack

The second keynote of the day was by John Griffith, Project Technical Lead, SolidFire.  John’s presentation had the compelling subtitle: ‘building the next generation data centre with OpenStack’.

A lot of people started using Amazon, who I understand to be the most successful public cloud provider, to use IT resources more efficiently.  There are, of course, other providers such as Google compute engine (Google), Windows Azure (Microsoft), and SoftLayer (which appears to be an IBM company).

A number of years ago, at an OU postgrad event, I overheard a discussion between two IT professionals that began with the question, ‘so, what are the latest developments in servers?’  The reply was something about server consolidation: putting multiple services on a single machine, so you can use that one machine (a physical computer or server) more efficiently.  This could be achieved by using virtual machines, but you can only do so much with virtual machines.  What happens if you run out of processing power?  You need to either get a faster machine, or move one of your virtual machines to another machine that might be under-utilised.

The next generation data centre will be multi-tenant (which means multiple customers or organisations using the same hardware), have mixed workloads (I don't really know what this means), and have shared infrastructure.  A key aspect is that an infrastructure can become software defined, as opposed to hardware defined, and the capacity of a cloud configuration or setup can change depending upon local demand.

There were a number of attributes of cloud systems.  I think they were: agility, predictability, scalability and automation.

In the cloud world applications can span many virtual machines, and data can be stored in scalable databases that are structured in many tiers.  The components (that make up a cloud installation) can be configured and managed through sets of predefined interfaces (or APIs).  I also made a note of a mobile app that can be used to manage certain OpenStack clouds.  One example of this is the Cloud mobile app from Rackspace.

Another interesting quote was, ‘[the] datacentre is one big computer and OpenStack is the operating system’.  Combining servers together has potential benefits in terms of power consumption, cooling and the server footprint.

One thing that developers need to bear in mind is how to create applications.  Another point was: consider scalability and plan for failure.  A big challenge lies with uncovering and deciphering what all the options are.  Should you use, for example, block storage services, or object storage?  What are the relative advantages and disadvantages of each?

Parts of this presentation started to demystify some of the terms that have baffled me from the start.  Cinder, for example, is OpenStack’s block storage.  Looking outwards from the operating system, a block storage device could be a hard disk, or a USB drive.  Cinder, in effect, mimics what a hard drive looks like, and you can store stuff to a Cinder service as if it was a disk drive.  Swift is an object database where you can store objects.  So, you might think of it in terms of sets of directories, the contents of which are replicated over different hard drives to ensure resilience and redundancy.

There is a difference between a service that is an abstraction to store and work with data, and how physical data is actually stored.  To make these components work with actual devices, there are a range of different plug-ins.
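By way of a rough illustration of the difference (again a sketch using the openstacksdk Python library, with made-up container, object and volume names), storing something in Swift is a matter of putting a named object into a container, whereas Cinder hands back a volume that a virtual machine can later treat as an ordinary disk:

    import openstack

    conn = openstack.connect(cloud='mycloud')

    # Swift: object storage -- put a named blob of data into a container
    conn.object_store.create_container(name='backups')
    conn.object_store.upload_object(
        container='backups',
        name='notes.txt',
        data=b'meeting notes go here',
    )

    # Cinder: block storage -- create a 10 GB volume that can later be
    # attached to a server, where it will appear as a disk drive
    volume = conn.block_storage.create_volume(name='scratch-disk', size=10)
    print(volume.status)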

Presentation: vArmour

I have to admit that I found this presentation thoroughly baffling.  I had no idea what was being presented until I finally picked up on the word ‘firewall’, and the penny dropped: if a system architecture is defined in software, the notion of a firewall as a physical device suddenly becomes very old fashioned, if not a little bit quaint.

In the cloud world, it’s possible to have something like a ‘software firewall’.  A term that I noted down was ‘software defined security’.  Through SDS, you can define what traffic is permissible between nodes and what isn’t, but in the ‘real world’ of physical servers, I’m assuming that physical ‘top layer’ firewalls are important too.

I also came across two new terms (or metaphors) that seem to make a bit of sense in the ‘cloud world’.  Data could, for example, move in a north-south direction, meaning it goes up and down through various layers.  If you’ve got east-west movement of data, it means you’re dealing with a situation where you might have a number of different virtual machines (that might have been created to respond to end user demand), which may share data between each other.  The question is: how do you maintain security when the nature of a configuration might dynamically change? 

Another dimension to security which crossed my mind was the need for auditability and disaster recovery, and both were subjects that were touched upon by other presenters.

In essence, I understood vArmour to be a commercial software defined security product that works akin to a firewall that can be used within a cloud system.

Presentation: The search for the cloud’s ‘God Particle’

Chris Jackson, who works for Rackspace (a company which has the tagline ‘the open cloud company’), gave the final presentation before we all broke for lunch.  Chris confessed to being a physicist (as well as a geek) and referred to research at CERN to find ‘the God particle’.  I also seem to remember him mentioning that OpenStack was used by CERN; there’s an interesting superuser case study (OpenStack website), for those who might be interested.

Here’s the question: if there is a theory that can describe the nature of matter, is there a theory that might explain why a cloud solution might not be adopted?  (He admitted that this was a bit of fun!)  He presented three different theories and asked us to vote on which were, perhaps, the most significant.

The first was: application.  Some applications can be rather fragile, and might need a lot of cosseting, whereas other forms of application might be very robust; they’re all different.  Cloud applications, it is argued, embrace chaos and build failure into applications.  Perhaps the precise character of certain applications might not lend it to being a cloud application?

Theory two: integration.  There could be the challenge of integration and connection with existing systems, which might themselves have different characteristics. 

The third theory is all about operations.  This is more about the culture of an organisation.

So, which theory is the reason why organisations don’t adopt a cloud solution?  The answer is: quite possibly all of them.

Christopher Douce

Disabled student services conference 2014 – day 2

Visible to anyone in the world
Edited by Christopher Douce, Tuesday, 22 Jan 2019, 09:39

Keynote: positive thinking

The first keynote of the day was by motivational speaker, David Hodgson.  The title of his session was, ‘the four key ways to happiness and success’ (which was a really very ambitious title, if you ask me!)  I’ve seen David talk before at a staff development day in London, where he encouraged us to reflect upon our Myers-Briggs personality profile.  Apparently, this was the focus of a later workshop that he ran later during the morning.

So, what are the four key ways?  Thankfully, I was sufficiently caffeinated to be able to take a note of them.  They are: (1) know yourself (and the great things that you’re capable of), (2) have self-belief, (3) have a plan (of some kind), and (4) have a growth attitude.   Of course, I’m paraphrasing, but, all in all, these are pretty good points to think about.

David also presented us with a quote from Abraham Maslow, who proposed his eponymous Hierarchy of Needs (Wikipedia).  The quote goes:  ‘If you plan on being anything less than you are capable of being, you will probably be unhappy all the days of your life.’  Maslow might have accompanied that quote with a wagging finger and the words, ‘you really need to sort yourself out’.  I had these words rattling around my head for the next two days.

Workshop: Learning design for accessibility

The first workshop of the day was facilitated by Lisette Toetenel and Annie Bryan from the OU’s Institute of Educational Technology.  The focus of the event was a learning design tool that IET had created to help module teams consider different pedagogic approaches.  It has been embedded into the module design process, which means that module chairs have to signify that they’ve engaged with IET’s learning design framework.  Through my involvement with a new module, I had heard a little about it, but I didn’t know the detail.

Learning design was defined as, ‘the practice of planning, sequencing and managing learning activities’, usually using ICT tools to support both design and delivery.  An important point was that both accessibility and important areas such as employability skills need to be considered from the outset (or be ‘woven into’ a design) and certainly not ‘bolted on’ as an after-thought.

The learning design framework is embedded into a tool, which takes the form of a template that either module members or a module chair has to complete.  Its objective is to improve quality, share good practice, speed up the decision-making process, and manage (and plan) student workload.  The tool has an accompanying Learning Design website (but you might have to be a member of the university to view it).

During the workshop we were divided up into different tables and asked to read through a scenario.  Our table was given an ‘environmental sciences’ scenario.  We were asked three questions: what exactly do students do [in the scenario]?  How do (or might) they spend their time?  And what accessibility problems might they be confronted with?

The point was clear: it’s important to consider potential barriers to learning as early as you can.

Keynote: SpLDs – The Elephant in the Counselling Room: recognising dyspraxia in adults

The final keynote of the conference was given by Maxine Roper (personal website).  Maxine describes herself as a freelance journalist and writer, and a member of the Dyspraxia Foundation.

One of the main themes of her keynote was the relationship between dyspraxia and mental health.  Now, I’ll be the first to say that I don’t know very much about dyspraxia.  Here’s what I’ve found on the Dyspraxia Foundation website: ‘Dyspraxia, a form of developmental coordination disorder (DCD) is a common disorder affecting fine and/or gross motor coordination, in children and adults. … dyspraxia [can also refer] to those people who have additional problems planning, organising and carrying out movements in the right order in everyday situations.’

I was struck by how honest and personal Maxine’s talk was.  Dyspraxia is, of course, a hidden disability.  Maxine said that dyspraxics are good at hiding their difficulties and their differences, and spoke at length about the psychological impact.  An interesting statistic is that ‘a dyspraxic child is 4 times more likely to develop significant psychological problems by the age of 16’ (from the Dyspraxia Foundation).

Some of the effects can include seeing other people as more capable, or being ‘over-givers’ with a view to maintaining friendships; other people might go the other way and become unnecessarily aggressive (as a strategy for covering up ‘difference’).  Sometimes people may develop reactive depression in response to the continual challenge of coping.

I found Maxine’s description of the psychological impact of having a hidden disability fascinating – it is a subject that I could very easily relate to because I also have a hidden disability (and one that I have also tried for a long time to hide).  This made me ask myself an obvious question that might well have an obvious answer: ‘are these thoughts, and the psychological impact, common across other types of hidden disabilities?’ 

So, what might the solutions be?  Maxine offered a number of answers: one solution could be to raise awareness.  This would mean awareness amongst students, and amongst student counsellors and those who offer support and guidance.

I noted down another sentence that was really interesting and important, and this was the point about coping strategies.  People develop coping strategies to get by, but these coping strategies might not necessarily be the most appropriate or best approach to adopt.  In some cases it might be necessary to unpick layers of accumulated strategies to move forward, and doing this has the potential to be really tough.

Maxine’s presentation contained a lot of points, and one of the key ones for me (the elephant in the room) was that it’s important to always deal with the person as a whole, and that perhaps there might (sometimes) be other reasons why students might be struggling.

Workshop: Through new eyes – understanding the experience of blind and partially sighted learners

The final workshop of the conference was given by my colleague Richard Walker, who works as an associate lecturer for the Maths Computing and Technology Faculty.  Like Maxine, Richard spoke from his own experience, and I found his story and descriptions compelling and insightful.

Richard told us that he had worked with a number of blind and partially sighted students over the years.  He challenged us with an interesting statistic: given the proportion of people in the general population who have visual impairments, if an associate lecturer tutors a subject for around ten or so years, there is roughly a 90% chance that they will encounter a student who has a visual impairment.  The message is clear: we need to be thinking about how to support our students, which also means thinking about how we support our associate lecturers too.
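Out of curiosity, here is a rough back-of-the-envelope calculation (in Python) showing how a figure of that order can arise; the prevalence, group size and career length are my own assumed numbers for illustration, not the figures Richard actually used.

# Rough estimate of the chance that a tutor meets at least one student with a
# visual impairment over a tutoring career.  The numbers below are assumptions
# for illustration, not the figures quoted in the workshop.

prevalence = 0.01     # assumed proportion of students with a visual impairment
group_size = 20       # assumed students per tutor group
years = 10            # assumed years of tutoring, one group per year

students_taught = group_size * years
p_at_least_one = 1 - (1 - prevalence) ** students_taught
print(f"Chance of at least one such student: {p_at_least_one:.0%}")
# With these assumptions the chance works out at roughly 87%, the same
# ballpark as the 90% figure quoted in the session.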

Richard has had a stroke which has affected his vision.  Overnight, he became a partially sighted tutor.  ‘This changed how I saw the world’, he said. 

One of his comments has clearly stuck in my mind.  Richard said that when he was in hospital he immediately wanted to get back to work.  Richard later started a blog to document and share his experiences, and I’ve also made a note of him saying that he ‘couldn’t wait to start my new career’, and ‘when I got home from hospital I wanted to download some software so I can continue to be an Open University tutor’.

Richard spoke about the human visual system, which was fascinating stuff, where he talked about the working of the eye and our peripheral vision.  He presented simulations of different visual impairments through a series of carefully drawn PowerPoint slides.  On the subject of PowerPoint, he also spoke briefly about how to make PowerPoint accessible.  His tips were: keep bullet points very short, choose background and foreground colours that have a good contrast, and ensure that you have figure descriptions.

I was struck by Richard’s can-do attitude (and I’m sure others were too).  He said, ‘the whole world looks a bit different, and I like learning new stuff, so I learnt it’.  An implication of becoming partially sighted was that this affected his ability to read.  It was a skill that had to be re-learnt or re-discovered, which sounds like a pretty significant feat.  ‘I just kept looking at the lines, and I’ve learnt to read again.  You just experiment [with how to move your eyes] and you see what works’.

When faced with the change in his vision, he contacted his staff tutor for advice, and some accommodations were put in place.  Another point that stood out for me was the importance of trust; his line manager clearly trusted Richard’s judgement about what he could and could not do.

Sharing experience

Richard tutors on a module called M250 Object-oriented programming (OU website).  When students study M250 they write small programs using a software development environment.  Richard made the observation that some software development environments can be ‘hostile to assistive technology’, such as screen readers.

Richard is currently tutoring a student who has a visual impairment.  To learn more about the student’s experience, he interviewed the student by email – this led to the creation of a ‘script’.  With help from a workshop delegate, Richard re-enacted his interview, where he asked about challenges, assistive technologies, study strategies and what could be done to improve things.  We learnt about the use of Daisy talking books (Wikipedia), the fact that everything takes longer, about strategies for interacting with computers, and the design of ‘dead tree’ books that could be read using a scanner.  After the performance, we were set an activity to share views about what we learnt from the interview (if I remember correctly).

Towards the end of the workshop, Richard facilitated a short discussion about new forms of assistive technologies and ubiquitous computing, and how devices such as Google Glass might be useful; thought provoking stuff.

I enjoyed Richard’s session; it was delivered with an infectious enthusiasm and a personal perspective.  The final words that I’ve noted down in my notebook are: ‘it’s not because I’ve got a strength of character, it’s because I love my work … you just have got to get on with it’.

Reflections

Like all the others, this year’s disabled student services conference was both useful and enjoyable.  These events represent an invaluable opportunity to learn new things, to network with colleagues, and to take time out from the day job to reflect on the challenges that learners face (and what we might be able to do to make things easier).

For me, there were a couple of highlights.  The first was Keith’s understated but utterly engaging keynote.  The second was Richard Walker’s workshop: I had never seen Richard ‘in action’ before, and he did a great job of facilitation.  In terms of learning, I learnt a lot from Maxine’s talk, and it was really interesting to reflect upon the emotional and psychological impact that a hidden disability can have on someone.  I feel it’s an issue that is easily overlooked, and is something that I’ll continue to mull over.  In some respects, it has emphasised, to me, how demanding and important the role of learning support advisors is to the university.

One question that I have asked myself is: ‘what else could be done within the conference?’  This, I think, is a pretty difficult question to answer, since everything was organised very well, and the whole event was very well attended.

One thought is about drama.  Richard’s session contained a hint of drama, where he used a fellow delegate to read a script of his email interview.  I’ve attended a number of excellent sessions in the East Grinstead region (which is now, sadly, going to be closed) that made use of ‘forum theatre’.  Perhaps this is an approach that could be used to allow us to expose issues and question our own understandings of the needs of our students.  Much food for thought.

Permalink 1 comment (latest comment by Jonathan Vernon, Wednesday, 28 May 2014, 14:13)
Christopher Douce

Disabled student services conference 2014 – day 1

Visible to anyone in the world
Edited by Christopher Douce, Tuesday, 22 Jan 2019, 09:40

I recently attended the university’s disabled student services conference held between 13 and 14 May 2014.  I think this was the third time I’ve been to this event, and every time I go I always learn something new.

This is a quick blog summary of the sessions I attended.  I guess this summary serves a number of purposes.  Firstly, it’s a summary of some of the continuing professional development I’ve been getting up to this year.  Secondly, it might be of interest to any of my students who might be studying H810 accessible e-learning (OU website).  Thirdly, it might be useful to some of my colleagues, or for anyone who accidentally stumbles across this series of two posts.

The complexities of co-occurrence

The first session of the day was presented by my colleague Jonathan Jewell, who works as an associate lecturer for at least three different faculties.  My first thought was, ‘what is meant by co-occurrence?’ - it wasn’t a term I had heard before.  I quickly figured out that it means that a person can have a number of different conditions at the same time.  A big part of his session was about what this might mean in terms of understanding a profile that contains quite a lot of information.

During Jonathan’s session I remember a debate about the terms ‘student-centred’ and ‘person-centred’.  The point was that although a student might be studying a particular module, they are on a programme, and this can, of course, relate to a broader set of personal objectives that they might hold.

Every student who discloses a disability may have their own disability profile. The aim of the profile is to tell a tutor something about their students to help them to understand what adjustments (in terms of their tuition) they could make.

During Jonathan’s session we looked at a sample profile and thought about it in terms of its strengths and weaknesses.  Our group concluded that the profile we were given contained a lot of information.  A particular weakness was that it contained a lot of quite technical jargon that was quite hard to understand.  A later task was to devise a ‘tutor plan of action’ based on the profile.  A clear point that was mentioned was the importance of establishing early contact with students to ensure that they feel comfortable and supported.

Towards the end of the session, I remember a debate about how student profiles can change; some disabilities are temporary.  I also understand that there are now clearer university guidelines about how profiles should be written; a profile written today might be different to how it was written a couple of years ago.

Keynote: REAL services to assist students who identify with Asperger syndrome (AS)

The first keynote of the day was by Nichola Martin, who I understand works for the University of Cambridge.  The ‘REAL’ bit of her presentation title is an abbreviation for reliable, empathic, anticipatory and logical – the idea is that we should embody these attributes when we work with people who identify with having Asperger syndrome (AS).  Very early on during her presentation she made the key point that ‘if you’ve met one person with AS, you’ve only met one person with AS’. 

Nichola also exposed us to stereotypes from the media, which she asked us to question.  The use of language is fundamentally important too, i.e. the term ‘condition’ is better than ‘disorder’ which suggests that something is fundamentally wrong.  Another interesting point is that the characteristics of people can change over time, a point that neatly connects back to the previous session about the changing nature of student profiles.

A big part of Nichola’s presentation was to share some findings from a research project that studied the views of students.  Its aim was to develop a model of best practice for students with AS, improve access to diagnosis, raise awareness and develop networks.

One really important point concerned clear language: always be clear in what you say or write.  Another point that I have noted is that if we make accommodations for one group, this is likely to help all students.  Stating assumptions in a clear and respectful way is, of course, useful for everyone. 

Another point is that institutions can be difficult to negotiate, particularly during the early stage of study.  If things are chaotic at the beginning of university study, it might be difficult to get back onto an even keel.  Some challenges that students might face can include finding their way through new social environments.  I’ve noted down a quote which goes, ‘my main barriers have been social and I find large groups of people I don’t know intimidating – as a result, I rarely attend lectures and often feel alone’.

There were some really interesting points about disability and identity which deserve further reflection.  Some students choose not to disclose and don’t go anywhere near the disability services part of the university.  Students may not want ‘special services’, since this hints at the notion of ‘othering’, or the emphasis of difference.  If people don’t want to talk about their personal circumstances, that is entirely their right.

We were told that Asperger’s and autism are terms that are used interchangeably, and this is reflected in the most recent publication of the DSM (Wikipedia, Diagnostic and Statistical Manual of Mental Disorders).

There were a number of things that were new to me, such as The Autism Act 2009 (National Autistic Society), and The Autism Strategy 2010 (National Autistic Society), which has been recently updated.  Another interesting and useful link is a video interview produced by the National Autistic Society (YouTube).   It was also great to hear that Nichola also mentioned OU module SK124 understanding the autism spectrum (OU website). 

All in all, a thought provoking talk.

Workshop: Student Support Teams and Disabled Students Support

The next event I went to was a workshop where different members of the newly formed student support teams (SSTs) were brought together to discuss the challenges of supporting students who have disabilities.  Again, the subject of student profiles was also discussed.

My own perspective (regarding student support teams) is one that has been really positive.  Whenever I’ve come across an issue when I needed to help a student (or a tutor) with a particular problem, I’ve always been able to speak with a learning support advisor, and they have been unstintingly helpful.  I personally feel that now there are more people who I can speak to regarding advice and guidance.

Keynote: The life of a mouth artist

The final keynote of the day was a really enjoyable and insightful talk by artist, Keith Jansz.  Keith began by telling us about his background.  After being involved in a car accident, in which he was significantly paralysed, he started to learn how to draw and paint after being given a book about mouth artists by his mother-in-law. 

Keith spoke about how he learnt to paint, describing the process that he went through.  Being someone who has a low opinion of my own abilities when it comes to using a pencil, I found his story fascinating.  I enjoyed Keith’s descriptions of light, colour, and the creative process. What struck me were the links between creativity, learning and self-expression; all dimensions that are inextricably intertwined. 

I thought his talk was a perfect keynote for this conference.  It was only afterwards that the implicit connections between Keith’s talk and university study became apparent. Learning, whatever form it may take, can be both life changing and life affirming.

During the conference, there was an accompanying exhibition of Keith’s work.  You can also view a number of his paintings on his website.

Christopher Douce

Ten Forum Tips

Visible to anyone in the world

I spend quite a lot of time using on-line discussion forums that are used as a part of a number of Open University modules I have a connection with.  I also wear a number of different ‘hats’; as well as being an Open University tutor, I also spend time visiting forums that are run by other Open University tutors in my role as a staff tutor.

A couple of years ago, I was sent a copy of a book called e-moderating (book website) by Gilly Salmon, who used to work at the Open University business school.  The e-moderating book is really useful in situations where the discussion forums constitute a very central part of the teaching and learning experience.  Salmon offers a raft of useful tips and a helpful five stage model (which can be used to understand the different types of interaction and activities that can take place within a forum).

Different modules use discussion forums in different ways.  In some modules, such as H810 Accessible on-line learning they are absolutely central to the module experience.  In other modules, say, M364 Interaction Design, they tend to adopt more of a ‘supporting’ rather than a ‘knowledge creation’ role.

Just before breaking for Christmas and the New Year, I started (quite randomly) to write a list of what I thought would be my own ‘killer tips’ to help tutors with forums.  This is what I’ve come up with so far.

1. Be overly polite

One phrase that I really like is ‘emotional bandwidth’.  In a discussion forum, we’re usually dealing with raw text (although we can pepper our posts with emoticons and pictures). 

This means that we have quite a ‘narrow’ or ‘low’ emotional bandwidth; our words and phrases can be very easily misunderstood and misinterpreted by others, especially in situations when we’re asking questions with the objective of trying to learn some new concept or idea.  Since our words are always ambiguous, it’s important to be overly polite.   

Be more polite than you would be in real life!

2. Acknowledge every introduction

The start of a module is really important.  The first days or weeks represent our chance to ‘set the tone’.  If we set the right tone, it’s possible to create momentum, to allow our discussion forums to attract interaction and conversation.

A good idea at the start of a module is to begin an ‘introduction’ thread.  Start this thread by posting your own introduction: set an example.  When other introductions are posted, take the time to send a reply to (or acknowledge) each one.

3. Use pins

Some discussion forums have a feature that allows you to ‘pin’ a discussion thread to the top of a forum. 

The act of ‘pinning’ a thread highlights it as something that is important.  Pins can be really useful to highlight discussions that are current or important (such as an activity that needs to be completed to prepare for an assignment, for example).  Consequently, it’s important not to pin everything.  If you do, students will be unclear about what is important and what is not, and this risks hiding important discussions. 

Use ‘pinned threads’ in a judicious way and regularly change what threads are pinned (as a module progresses) – this suggests that a forum is alive and active.

4. Tell your students to subscribe

There are a couple of ways to check the OU discussion forums.  One way is to login regularly and see whether anyone has made any new posts.  Another way is to receive email updates, either from individual threads or from whole discussion forums.  At the start of a module presentation, it’s a good idea to tell your student group to subscribe to all the forums that are used within a module.  One way to do this is by sending a group email.  When a student has subscribed they are sent an email whenever anyone makes a post or sends a reply (the email also contains a copy of the message that was posted).

One of the really good things about using emails to keep track of forums is that it’s possible to set up ‘rules’ on an email client.  For example, whenever a forum related email is received, it might be possible to transfer it to a folder based on the module code that is contained within the message subject.  This way, you can keep on top of things without overloading your inbox.
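For anyone who would rather script this than click through their email client’s rules dialogue, here is a minimal sketch using Python’s standard imaplib module; the server address, credentials and the assumption that folders named after module codes already exist are all placeholders that you would need to adapt to your own mail setup.

import imaplib
import re
import email

# Placeholder connection details - replace with your own provider's settings.
IMAP_HOST = "imap.example.com"
USERNAME = "me@example.com"
PASSWORD = "app-password"

# Matches module codes such as M250, TT284 or H810 in a message subject.
MODULE_CODE = re.compile(r"\b([A-Z]{1,2}\d{3})\b")

def file_forum_mail():
    """Move unread forum notifications into folders named after their module code."""
    with imaplib.IMAP4_SSL(IMAP_HOST) as imap:
        imap.login(USERNAME, PASSWORD)
        imap.select("INBOX")
        _, data = imap.search(None, "UNSEEN")
        for num in data[0].split():
            _, msg_data = imap.fetch(num, "(RFC822.HEADER)")
            subject = str(email.message_from_bytes(msg_data[0][1]).get("Subject", ""))
            match = MODULE_CODE.search(subject)
            if match:
                imap.copy(num, match.group(1))          # assumes the folder already exists
                imap.store(num, "+FLAGS", "\\Deleted")  # remove the original from the inbox
        imap.expunge()

file_forum_mail()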

5. Encourage and confirm

Busy forums are likely to be the best forums.  One approach to try to create a busy forum is to do your best to offer continual encouragement; acknowledge a good post and emphasise key points that have been raised.  (Salmon writes about weaving together and summarising a number of different discussions). 

Another really great thing to do is to seek further confirmation or clarifications.  You might respond to a message by writing something like, ‘does this answer your question?’  This keeps a discussion alive and offers participants an opportunity to present alternative or different perspectives.

6. Push information about TMAs

Tutor marked assignments (TMAs) are really important.  As soon as a TMA is submitted, students will generally expect it back within 10 working days (which is the university guideline).  Sometimes TMAs are returned earlier, and in some situations (with permission from the staff tutor), it can take a bit longer.  A forum can be used to provide ‘push’ updates to students about how marking is progressing.  Once a TMA cut-off date has been met, a tutor could start a forum thread entitled, ‘TMA x marking update’. 

When you’re approximately half way through the marking, one idea is to make a post to this thread to say so.  Also, if your students have subscribed, they’ll automatically receive the updates.  This reduces pre-TMA result anxiety (for the students), since everyone is kept in the loop about what is happening.  (The thread can also be used to post some general feedback, if this is something that is recommended by the module team).

7. Advertise tutorials

Open University tutorials can be either face to face (at a study centre, which might be at a local university or college), or can take place on line through a system called OU Live.  A post to a discussion forum can be used to remind students about tutorials.   They can also be used to offer some guidance to students to help them to prepare for the session.  You could also ask whether students (in your group) have any particular subjects or topics that they would like to be discussed or explored.

After the tutorial, a forum can be used to share handouts that were used during either an on-line session or a day school.  It also offers students an opportunity to have a discussion about any issues that (perhaps) were not fully understood.  Also, during a tutorial, a tutor might set up or suggest a long running research task.

There are a number of advantages of connecting tutorials to forum discussions.  Those people who could not attend can benefit from any resources that were used during an event.  It also allows a wider set of opinions and views to be elicited from a greater number of students.

8. Provide links

A subject or topic doesn’t begin and end with the module materials.  During the presentation of a module, you might happen to see a TV programme that addresses some of the themes that are connected to a particular topic of study.  A forum is a great way to contextualise a module by connecting it to current stories in the media, and one way you can do this is by either posting links to a news story (or series of stories), or perhaps by starting a discussion.

As well as sharing news stories, you can also use discussion forums to alert students to some of the study skills resources that have been developed by the Open University.  There are also some library resources that might be useful too.  Other resources might include OpenLearn resources, for example.  A forum is a great way to direct students to a wide array of useful and helpful materials.  You might also want to ask (using the tutors’ forum) whether other tutors have suggestions or ideas.

9. Visit other forums

Every tutor does things slightly differently; no tutor is exactly the same, and this is a good thing.  If you tutor on a module, there’s a possibility that you might be able to view another tutor’s discussion forum.  If you have the time, do visit another tutor’s forum.  Some good questions to ask are: ‘at a glance, do the students look engaged?’ or ‘how busy is this forum?’  Other questions might be, ‘what exactly is the tutor doing?’ and ‘how are they asking questions?’  This allows you to get a view on how well a forum is being run.  When you see a busy and well run forum, ask the question: ‘is the tutor doing something special here?’  If so, what is it?  Sometimes, of course, certain cohorts can just be pretty quiet; some years are busier than others.

After visiting a forum, the best questions to ask are, ‘what have I learnt?’ and ‘is there anything that I could or should be doing with my forum?’

10. Form forum habits

The more that you’re active on a forum, the more useful a forum can become for students.  Find some time, every day, or every couple of days, to read through and respond to forum posts.  This will keep your forums fresh and alive; they may even acquire a ‘stickiness’ of their own and become pages that students are drawn to time and time again.

Summary

This quick blog post summarises a number of ‘forum tips’ that I’ve discovered over the last couple of years of working with different modules.  Some of these ideas have, of course, been shaped by the e-moderating book that I have mentioned earlier.  E-moderating is a book that is useful for some modules and not others since different modules use forums in slightly different ways.  Although a module team might use a forum in a particular way, it is always going to be up to you, a tutor, to take ownership of this important learning space.

Finally, if you would like to add to these tips (or even disagree with them), please don’t hesitate to let me know!

Permalink 1 comment (latest comment by Ravina Talbot, Sunday, 3 Feb 2019, 15:44)
Christopher Douce

eSTEeM Conference – Milton Keynes, May 2014

Visible to anyone in the world
Edited by Christopher Douce, Tuesday, 20 May 2014, 09:50

On 5 May 2014, I was at Milton Keynes again.  I had something called a module team meeting in the morning.  In the afternoon I attended an OU funded conference that had the title (or abbreviation) eSTEeM (project website).

eSTEeM is an initiative to conduct research into STEM education.  STEM is an abbreviation for Science Technology Engineering and Mathematics.  Since I have some connections with some computing modules, which can cross the subjects of Engineering and Mathematics, I decided to submit a proposal that had the objective of learning more about the tutor’s experience of teaching computer programming.  The aim was simple: if we learn more about the tutor’s challenges, we can support them better, and subsequently they will be able to support the students better too.

I have been lucky enough to receive a small amount of funding from the university.  This, of course, is great news, but it also means that I’ve got even more work to do!  (But I’m not complaining - I accept that it’s all self-inflicted, and it’s work that will allow us to get at some insights).  If you’re interested, here’s some further information about the project (eSTEeM website).

A 'pilot' project

The ‘understanding the tutors and the students when they do programming’ project is a qualitative study.  In this case, it means that I’ll be analysing a number of interviews with tutors.  I’ll be the first to admit that it’s been quite a while since I have done any qualitative research, so I felt that I needed to refamiliarise myself with what I needed to do by, perhaps, running a pilot study.

It wasn’t long before I had an idea that could become a substantive piece of research in its own right. I realised that there was an opportunity to run a ‘focus group’ to ask tutors about their experience of tutoring on another module: T320 Ebusiness Technologies (OU website).  The idea was that the outcome from this study could feed directly into discussions about a new module.

During my slot at the conference, rather than talking about my research about programming (which was still at the planning stage), I talked about the T320 research, which was just about finished.  I say finished, when what I actually mean is ‘transcribed’; there is a lot more analysis to do.  What has struck me was how generous tutors are with both their opinions and their time.  Their views will really help when it comes to designing and planning the future module that I have a connection with.

Final thoughts and links

In case you’re interested, here’s a link to the conference programme.

What struck me was how much ‘internal research’ there was going on; there are certainly a lot of projects to look through.  From my perspective, I’m certainly looking forward to making a contribution to the next conference and sharing results from the web technologies and programming research project with colleagues.  The other great thing about getting my head into research again is that when you have one idea about what to look at, you suddenly find that you get a whole bunch of other ideas.

Christopher Douce

Widening Participation through Curriculum Conference (day 2 of 2)

Visible to anyone in the world
Edited by Christopher Douce, Wednesday, 24 Apr 2019, 17:29

The second day of the conference was to be slightly different to the first; there were fewer sessions, and there were a number of ‘talking circle’ workshop events to go to.  On the first day I arrived at the conference ridiculously early (I was used to the habit of travelling to Milton Keynes in time for meetings, and catching a scheduled bus to the campus).   On the second day, I was glad to discover that I wasn’t the first delegate to arrive.

Opening remarks

The second day was opened by Professor Musa Mihsein from the OU.  He presented an interesting story of how he came to work at the university as a PVC.  Musa talked about changes to funding, making the point that there has also been a change in the use of language.  There is more of a need to ‘maximise impact’.  The accompanying question is, of course, ‘how can we best evaluate projects and programmes?’

A couple of points I noted down were that we haven’t got a full understanding of curriculum and its role within the institution, and that collaborations are important.  There is also a continual need to communicate in different ways to policy makers.

Keynote 4: Liberating the curriculum

The first keynote of the day was by Kelly Coate, Senior Lecturer in Higher Education, from Kings College, London.  Kelly’s talk was interesting since it spoke directly to the ‘curriculum’ part of the conference title.  She has been researching curriculum for the last 20 years and made the point that, ‘decisions about curriculum are decisions about what we can think’ (if I’ve taken that down correctly).

Here are some of my notes: we’re accustomed to a certain view of what ‘curriculum’ means.  The word derives from a Latin word that means to run, or to proceed.  This makes a lot of sense: most participants make it to the finish line; there are often a couple of really high scorers and a couple who are, perhaps, left behind. 

If we dig around in history, the notion of curriculum used to be associated with the ‘liberal arts’.  This contains the disciplines of grammar, logic, rhetoric, music theory, astronomy, arithmetic, and geometry, with the word liberal being derived from liber, meaning ‘free’.

Kelly’s talk took an interesting twist.  Since she studies what people are studying, she was asked to comment on a story that Miley Cyrus was to be the subject of a university course.  If you’re interested, here’s a related news story: Back to twerk … Miley Cyrus to be studied on new university course (The Guardian).  Thinking about it for a moment, the subject of Miley can readily be used to facilitate discussions about femininity, power, exploitation, celebrity, sexuality…

A bit of theorising is always useful.  We could think about curriculum in three different domains: knowing, acting and being.  There is also the importance of relating teaching to the now, which opens up the possibility of students suggesting their own curricula by researching how ‘the now’ relates to the broad subject area.

Another way of thinking about curriculum might be in terms of gravity and density.  Gravity is the extent to which a subject can be related to a particular context.  Density relates to how much theory there is (some subjects can be incredibly theoretical).  I really like these metaphors: they’re a really good (and powerful) way to think about how a lecturer or teacher might be able to ‘ground’ a particular concept or idea.

We were briefly taken through a couple of ideas about learning and pedagogy.  The first one was the transmission model (which, I think, was described as being thoroughly discredited), where a lecturer or teacher stands at the front of the class and talks, and the students magically absorb everything. The second idea (which I really need to take some time out to look at) is actor-network theory (wikipedia).  Apparently it’s about thinking about systems and networks and how things are linked through objects and connections.  (This is all transcribed directly from my notes - I need to understand it in a whole lot more depth than I do at the moment!)

I’ve also made a note about a researcher called Jan Nespor  who has applied actor-network theory to study physics and business studies classes.  The example was that lecturers can orchestrate totally different experiences, and these might be connected with the demands and needs of a particular discipline (if I’ve understood things correctly!)

I’ve made a note of some interesting points that were made by the delegates at the end of Kelly’s speech.  One point was that different subjects have different cultures of learning, i.e. some subjects might consider professional knowledge to be very important.  Musa mentioned the importance of problem-based learning, particularly in subjects such as engineering. 

Session 3: Innovation in design and pedagogy

There was only one presentation in the third session which was all about pedagogy.  This was entitled ‘Creating inclusive university curriculum: implementing universal design for learning in an enabling programme’, by Stuart Dinmore and Jennifer Stokes.  The presentation was all about how to make use of universal design principles within a module.  We were introduced to what UD is (that it emerges from developments in design and architecture), that it aims to create artefacts that are useful for everyone, regardless of disability.

Connecting their presentation to wider issues, there are two competing (yet complementary) accessibility approaches: individualised design and universal design.  There is also the way in which accessibility can be facilitated by the use of helpers, to enable learners to gain access to materials and learning experiences.

It was great that this presentation explicitly spoke to the accessibility and disability dimension of WP, also connecting to the importance of technology.  During Stuart and Jennifer’s presentation, I was continually trying to relate their experiences with my own experience of tutoring on the OU module, H810 Accessible online learning: supporting disabled students (OU web page)

Talking circle

I chose to attend innovation in design and pedagogy.  I do admit that I did get a bit ‘ranty’ (in a gentle way) during this session.  This was a good opportunity to chat about some of the issues that were raised and to properly meet some of the fellow delegates.  Some of the views that I expressed within this session are featured in the reflection section that follows.

Closing keynote:  class, culture and access to higher education

The closing keynote was by John Storan from the University of East London.  John’s keynote was a welcome difference; it had a richly personal tone.  He introduced us to members of his family (who were projected onto a screen using PowerPoint), and talked us through the early years of his life, and his journey into teacher training college, whilst constantly reflecting on notions of difference.

He also spoke about a really interesting OU connection too.  John was a participant in a study that gave rise to a book entitled, Family and kinship in East London (Wikipedia), by Michael Young and Peter Willmott.  (This is one of those interesting looking books that I’m definitely going to be reading – again, further homework from this conference).  ‘We were the subject’, John told us.  He also went on to make the point about the connections between lived experience, research, policy and curriculum.

I’ve made a note in my notebook of the phrase, ‘not clever, able enough’.  I have also been subject to what I now know to be ‘imposter syndrome’.  In the question and answer session, I made a note that codes of language can easily become barriers.

Reflections

One of the really unexpected things about this conference was the way that it accidentally encouraged me to think about my own journey to and through higher education.  Although for much of my early life I didn’t live in an area that would feature highly in any WP initiatives, higher education was an unfamiliar world to my immediate family.

Of course, my journey and my choices end up being quite nuanced when I start to pick apart the details of my biography, but I think there was one intervention that made a lasting impression.  This intervention was a single speech given by a member of staff at my former college about the opportunity that university study gave.  I remember coming away thinking, ‘I’m going to apply; I have nothing to lose, and everything to gain’.  A number of my peers thought the same.

The conference presented a number of different perspectives: the importance of assessing the effectiveness of interventions and the importance of theory, how to design WP curriculum, how to make curriculum accessible, and how to make materials engaging for different groups.  One aspect that I thought was lacking was that of the voices of the students.  It’s all very well discussing between ourselves what we think that we should be doing, but I felt it would be really valuable to hear the views of students. 

An area that would be particularly useful is to hear about instances of failure, or to hear about what went wrong when students tried university level study but couldn’t complete for some reason.  There are some really rich narratives that have the potential to tell researchers in WP and curriculum a lot about what institutions (and individuals) need to do.  The challenge, of course, is finding those people who would like to come forward and share their views.

In the sessions that I attended, there were clear discussions about class, socio-economic status and disability, but there seemed to be an opportunity to discuss more about ethnicity.  Quantitative research has shown that there is an attainment gap.   There was an opportunity for some qualitative discussions and more sharing of views regarding this subject.

Another thought relates to the number of keynote speeches.  Keynote speeches are really important, and it was great that they were varied – they are very important in setting the tone and the agenda – but more paper sessions (and perhaps a plenary discussion?) might expose different issues and allow more contacts to be made.

I appreciate that these final reflections sound a bit ‘whingey’; they’re not intended to be.  WP is an important issue, and the amount of follow-up homework I’ve got to do clearly tells me that the conference was a great success. 

In some ways I guess the conference was slightly different to what I had expected (in terms of the debate and discussions).  I was expecting it to be slightly less ‘academic’ and slightly more practitioner focussed (or oriented to those who deal with WP issues on a day to day basis).   The unexpected difference, however, was very welcome; I’ve learnt some new stuff.

Christopher Douce

Widening Participation through Curriculum Conference (day 1 of 2)

Visible to anyone in the world
Edited by Christopher Douce, Wednesday, 24 Apr 2019, 17:31

There are some days when I feel very lucky; lucky in the sense that my transition from school, to college and to university happened pretty painlessly.  Although my background has been far from privileged, I feel that I ended up making the right choices at exactly the right time, all by accident rather than by design.

Some of these thoughts were going through my head as I walked towards the hotel where the Widening Participation through Curriculum conference was held.  Other thoughts were connected with my day job, which is all about supporting the delivery of a range of undergraduate computing and ICT modules.  WP (as it seemed to be known within the conference) is something that I consider to be fundamentally important; it touches on my interactions with students, and the times that I work with members of a module team.  I also had a question, which was, ‘what more could I do [to help with WP]?’

This post is a summary of my own views of the Widening Participation through Curriculum conference that was held on two days from 30 April 2014 in Milton Keynes.  It’s intended as a rough bunch of notes for myself, and might be of distant interest to other delegates who were at the event (or anyone else who might find these ramblings remotely interesting).

Opening remarks

The opening address was by Martin Bean, Vice chancellor of the university.  He asked the question, ‘how do we ensure that widening participation is achieved?’  This is an easy question to ask, but a whole lot more difficult to answer.  Martin talked about moving from informal to formal learning, and the challenge of reaching out and connecting with adult learners in a sustainable way.  Other points included the importance of access curriculum (pre-university level study).  Access curriculum has the potential to encourage learners and to develop confidence.

Martin also touched upon the potential offered by MOOCs, or massive open online courses.  The OU has created a company called FutureLearn, which has collaborations with other UK and international universities.  A question is whether it might be possible to create level 0 (or access) courses in the form of MOOCs that could help to prepare learners for formal study (connecting back to the idea of transitions from informal to formal learning).  One thought that I did have is about the importance and use of technology.  Technology might not be the issue, but figuring out strategies to use it effectively might be.

Keynote 1: WP and disruption – global challenges

The first keynote of the conference was by Belinda Tynan, PVC for teaching and learning.  As she spoke, I made some rough notes, and I’ve scribbled down the following important points: models of partnerships, curriculum theory, impact of curriculum reform, and how students are being engaged.

Belinda touched upon a number of wide issues such as changing demographics, discrepancy between rich and poor, unemployment, and the relationship between technology and social inclusion; all really great points.

Another interesting point was about the digital spaces where the university does not have a formal presence.  We were told that there are in the order of 150 Facebook groups that students have set up to help themselves.  As an aside, I’ve often wondered about these spaces, and whether they can tell us something that the university could be doing better, in terms of either technology, interactive system design, or how to foster and develop collaboration.  Another thought relates to the research question: how much learning actually occurs within these spaces?   How much are we able to see?

A phrase that jumped out at me was, ‘designing curriculum that fits into people’s lives’.  Perhaps it is important that curriculum designers create small fragments of materials to allow students to manage the complexity of their studies.  Other key phrases include the importance of motivation, the role of on-line discussions, and the challenge of finding time.

We were shown a short video about learning analytics.  Learning analytics is a pretty simple concept.  Whenever we interact with a system, we leave a trace (often in the form of a web request).  The idea is that perhaps the sum total of traces will be able to tell us something about how students are getting along.  By using clever technology (such as machine learning algorithms), it might be possible to uncover problems and initiate targeted interventions, perhaps in collaboration with student support teams.
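To give a flavour of what this might mean in practice, here is a deliberately simple sketch that flags students whose weekly activity (counted from access-log traces) falls well below that of their peers; the data, names and threshold are invented, and a real learning analytics system would of course be far more sophisticated than this.

from statistics import mean, stdev

# Invented example data: VLE page views per student over one week.
weekly_views = {
    "student_a": 42,
    "student_b": 35,
    "student_c": 3,
    "student_d": 28,
    "student_e": 0,
}

def flag_low_engagement(views, z_threshold=-1.0):
    """Flag students whose activity is well below the group average."""
    values = list(views.values())
    mu, sigma = mean(values), stdev(values)
    return [student for student, count in views.items()
            if sigma and (count - mu) / sigma < z_threshold]

# A flagged student might prompt a targeted check-in from the student support team.
print(flag_low_engagement(weekly_views))   # ['student_e'] with this made-up data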

One thought that I had during this presentation was, ‘where is the tutor in this picture?’  Technology was mentioned a lot, but there was little mention about the personal support that OU tutors (or lecturers) offer.   There are many factors in helping students along their journey, and my own view is that tutors are a really important part of this.

The concluding points in Belinda’s keynote (if I’ve noted this down properly) return to the notion of challenges: the importance of the broader societal context, and the importance of connecting learning theory to student journeys.

Session 1: Measuring and demonstrating impact

Delegates could go to a number of parallel sessions about different topics.  The first paper session I dropped into was entitled ‘measuring and demonstrating impact’.  This session comprised two presentations.

The first presentation was entitled, ‘Impact of a pre-access curriculum on attainment over 10 years’, and it was from representatives of an organisation called Asdan Education, a charity which grew out of research from the University of the West of England.  I hadn’t heard of this organisation before, so all this was news to me.  Asdan have what is called the Certificate of personal effectiveness (Asdan website).  The presentation contained a lot of data suggesting that the curriculum (and the work by the charity) led to an improvement in some GCSE results.

The second presentation of the morning, given by Nichola Grayson and Johanna Delaney, was entitled, ‘can the key principles of open skills training enhance the experience of prospective students’. Interestingly, Nichola and Johanna were from the library services at the University of Manchester.  Their talk was all about a revision of library resources called ‘my learning essentials’.

The university currently has something called a ‘Manchester access programme’, which includes visits from schools, and an ‘extended project qualification’ (which I think allows students to gather up some UCAS points, used for university entry).  The new open training programme (if I have understood it correctly) has an emphasis on skills, adopts a workshop format and makes use of online resources.

During this presentation, I was introduced to some new terms and WP debates.  I heard the concept of the ‘deficit model’ for the first time, and there were immediate comments about its appropriateness (but more of this problematic concept later).

Session 2: Theory revisited

I went to this session because I had no idea what ‘theory’ means in the context of Widening Participation; I was hoping to learn something!

The first presentation was by my colleague Jonathan Hughes who gave a presentation entitled, ‘developing a theoretical framework to explore what widening participation has done for ‘non-traditional students’ and what it has done to them.’  Jonathan and his colleague Alice Peasgood have been interviewing WP experts, mostly professors who have published in the area.  The interviews were recorded and transcribed, and then analysed.

Jonathan made an interesting comment (or quip) that this is a technique that can be considered to be a short-cut to a literature review.  This is an idea that I’m going to take away with me, and it has actually inspired some thinking about how to understand the teaching of programming.

His analysis uses a technique called thematic analysis (Wikipedia), drawing on the work of Braun and Clarke.  This was also interesting: in terms of qualitative research, I’m more familiar with grounded theory (Wikipedia).  This alerted me to one of the dangers of going to conferences: that you can easily give yourself lots of homework to do.

Jonathan highlighted three main themes: the policy context (tuition fees in higher education), the wider context of marketised higher education, and how policies are interpreted and operationalised.  (He has written more about these in his paper).  I’ve made a note of a comment that there are different theoretical frameworks to understand WP: one is about enabling the gifted and talented to study, another is about how best to meet the needs of employers, and a third is about how to transform the university rather than the students.

The second talk, by Jayne Clapton, was entitled, ‘seeing a ‘complex’ conceptual understanding of WP and social inclusion in HE’.  Jayne presented a graphic of a metaphor of a complex mechanism which had a number of interlocking parts (which, I believe, represent various drivers and influences).

The discussion section was really interesting, particularly since the deficit model was attacked pretty comprehensively.  To add a bit more detail, the ‘model’ is where potential students have some kind of deficit, perhaps in terms of socio-economic background, for instance.  To overcome this there is the idea of having some kind of intervention done to them to help prepare them for higher education.  An alternative perspective is to view students in terms of ‘assets’; development opportunities can represent investments in individuals.

A concluding discussion centred upon the importance of research.  Research always has the potential to inform and guide government policy.  The point was that ‘we need effective research to back up any arguments that we make, and we need to know about the effectiveness about projects or interventions’.

Keynote 2: The ‘academic challenge’ in HE: intersectional dimensions and unintended affects on pedagogic encounters

The second keynote was by Professor Gill Crozier from Roehampton University.  I’ve made a note that Gill was talking about transition; that the transition to higher education is more difficult for working class, black and minority ethnic students.  Some students can be unsure what university is all about (I certainly place myself in that category).  Studying at university can expose students to unequal power relations between class, gender and race.

A really interesting point that I’ve noted down is one that relates to attitude.  In some cases, some lecturers are not happy giving additional support, since this requires them to ‘become nurturing’ in some sense, and some might consider it to be beyond the remit of their core ‘academic’ duties.  I personally found this view surprising.  I personally view those moments of additional support as real opportunities to help learners find the heart of a discipline, or get to the root of a problem that might be troublesome.  These moments allow you to reflect on and understand core ideas within your own discipline.  In comparison to lecturing in front of a room, you need to be dynamic; you need to get to the heart of the problem, and try your best to be as engaging as possible.  I also made a note about the importance of creating a ‘learner identity’.

There was a lot in terms of content in this presentation.  Two interesting notes that I made in my notebook are, ‘social identities profoundly shape dispositions’ (I’m not quite sure in what context I wrote this), and ‘little attention given to the experience of students at university’ (which is something that I’ll come back to in the final part of this blog).

Keynote 3: Widening success through curriculum: innovation in design and pedagogy

Stephanie Marshall, CEO of the Higher Education Academy (HEA website) gave the third keynote speech.  Stephanie began with an interesting anecdote, and one that I really appreciated.  Stephanie spoke about her early days of being a lecturer at (I think) the University of York.  She spoke to a colleague who apparently told her that ‘the OU had taught me to do all this’, meaning that the OU had taught them how to be a lecturer through training sessions that help associate lecturers understand how to run group sessions and how to choose and design effective activities.

My ears pricked up when Stephanie mentioned the HEA’s Professional Standards Framework (HEA website).  The UKPSF relates to the HEA’s accreditation process where lecturers have to submit cases to demonstrate their teaching and learning skills in higher education.

Like so many HE institutions, the HEA has also been through a period of substantial change, which has recently included a substantial reduction in funding.  This said, the HEA continues to run projects that aim to influence the whole of the sector.  Work streams currently include curriculum design, innovative pedagogies, transitions, and staff transitions (helping staff to do the things that they need to do).

There are also projects that relate to widening participation.  One that I’ve explicitly taken a note of is the retention and success project (HEA website) (it appears that there’s a whole bunch of interesting looking resources, which I didn’t know existed).  Other projects I’ve noted connect to themes such as attainment and progression, learning analytics and employability.

On the subject of WP, Stephanie gave a really interesting example.  During the presentation of a module, students studying English at one university expressed concerns about the relevance of particular set texts to the students who were studying them.  This challenge led to the co-development of the curriculum, a collaboration between students and lecturers to choose texts that were more representative (in terms of the ethnicity of the student body), thus allowing the module to be more engaging.  This strikes me as one of the fundamental advantages of face to face teaching; lecturers can learn, and challenging (and important) debates can emerge.

A final resource (or reference) that I wasn’t aware of was something called the Graduate attributes framework (University of Edinburgh).  Again, further homework!

Christopher Douce

OU e-learning Community – Considering Accessibility

Visible to anyone in the world
Edited by Christopher Douce, Sunday, 4 May 2014, 17:36

On April 23, I visited the Open University campus to attend an event to share lessons about how the university can support students who have disabilities. The event, which took place within a group called the ‘e-learning community’, had two parts to it: one part was about sharing of research findings, and the other part was about the sharing of practice.

This blog aims to summarise (albeit briefly) the four presentations that were made during the day.  It’s intended for a couple of audiences: colleagues within the university, students who might be taking the H810 accessible online learning (OU website) module that I tutor, and anyone else who is remotely interested.

Like many of these blogs, these event summaries are written (pretty roughly) from notes that I made during the sessions.  (This is a disclaimer to say that there might be mistakes and I’m likely to have missed some bits).

Academic attainment among students with disabilities in distance education

Professor John Richardson, from the OU’s Institute of Educational Technology gave the first presentation of the day.  John does quantitative research (amongst a whole load of other things), and he began by saying that our knowledge about the attainment of students who have disabilities is increasing, but that knowledge is fragmented.  John made a really important point, which was that it is patent nonsense to consider all disabled students as a single group; everyone is different, and academic performance (or attainment) is influenced by a rich combination of variables.  These include age, gender, socio-economic status, prior qualifications (and a whole bunch of others too).

When we look at quantitative data, it’s important to define what we’re talking about.  One of the terms that John clearly defined was the phrase ‘a good degree’.  This, I understand, was considered to be a first or an upper second class honours degree.  John also mentioned something that is unique about the OU; that it awards degree classifications by applying an algorithm that uses scores from all the modules that contribute towards a particular degree (whereas in other institutions, the classification comes from decisions made by an examination board).

We were given some interesting stats.  In 2009 there were 196,405 registered students, of which 6.8% declared a disability.  The most commonly disclosed disability was pain and fatigue, followed by dyslexia.  Out of all disabled students, 55% declared multiple disabilities.

In 2012 the situation was a little different. In 2012 there were 175,000 registered students, of which 12% (21,000) students declared (or disclosed) a disability.  John said that perhaps this increase might be an artifact of statistics, but it remains a fact.  He also made the point (raised by Martyn Cooper, a later speaker on the day) that this number of students represents the size of an average European university.  From these statements I personally concluded that supporting students with disabilities was an activity that the university needs to (quite obviously) take very seriously.

If I’ve got this right, John’s research drew upon a 2009 data set from the OU.  There were some interesting findings.  When controlling for other effects (such as socio-economic class, prior qualification and so on), students who had declared pain and fatigue or autistic spectrum disorders gained good degrees at a higher rate than non-disabled students.  Conversely, students who had disclosed dyslexia, specific learning disabilities or multiple disabilities gained a lower percentage of good degrees when compared with non-disabled students.

I’ve made a note of a couple of interesting conclusions.  To improve completion rates, it is a good idea to somehow think about how we can more readily support students who have disclosed mental health difficulties and mobility impairments.  To improve degree levels, we need to put our focus on students who have disclosed dyslexia and specific learning disabilities.  One take away thought relates to the university’s reliance on text (which is a subject that crops up in a later presentation).

Quantitative research can only tell us so much; it can tell us that an artifact exists, but we need to use other approaches to figure out the finer detail.  Qualitative research, however, can provide detail, but the challenge with qualitative approaches lies with the extent to which findings and observations can be generalised.  My understanding was that we need both to clearly create a rich picture of how the university supports students with disabilities. 

Specific learning differences, module development and success

The second presentation was a double act by Sarah Heiser (a colleague from the London region), and Jane Swindells, who works in the disability advisor service.  Jane introduced the session by saying that it was less about research and more about sharing a practitioner perspective.  I always like these kinds of sessions since I find it easy to connect with the materials and I can often pick up some useful tips to use within my own teaching.

An important point is that dyslexia has a number of aspects and is an umbrella term for a broader set of conditions.  It can impact on different cognitive processes, such as the use of working memory, speed of information processing, time management, co-ordination and automaticity of operations.  It can also affect how information is received and decoded. 

On-line or electronic materials offer dyslexic learners a wealth of advantages; materials can be accessed through assistive technologies, and users can personalise how content is received or consumed.  An important point that I would add is that the effectiveness of digital resources depends on the user being aware of the possibilities that they offer.  Developing a comprehensive awareness of the strategies of use (to help with teaching and learning) is something that takes time and effort.

Sarah spoke about a project where she has been drawing out practice experience from associate lecturers through what I understand to be a series of on-line sessions (I hope I’ve understood this correctly).  Important themes include the challenges that accompany accuracy, text completion, following instructions, time, and the importance of offering reassurance.

I’ve made a note of the term ‘overlearning’.  When I had to take exams I would repeat and repeat the things I had to learn, until I was sick of them.  (This is a strategy that I continue to use to this day!)

One point that I found especially interesting relates to the use of OU Live recordings.  If a tutor records a session, a student who may have dyslexia can go over it time and time again, choosing to pick up sections of learning at a time and a pace that suits them.  This depends on two things: the availability of the resource (tutors making recordings), and students being aware that the recordings exist and knowing how they can access them.

Towards the end of the session, Sarah mentioned a tool called Language Open Resources on-line, or LORO for short.  LORO allows tutors to share (and discover) different teaching resources.  I was impressed with LORO, in the sense that you can enter a module code and find resources that tutors might (potentially) be able to use within their tutorial sessions.

SeGA guidance: document accessibility/accessible maths and other symbolic languages

The third presentation of the day was from Martyn Cooper, from the Institute of Educational Technology.  Martyn works as a Senior Research Fellow, and he has been involved with a university project called SeGA, which stands for Securing Greater Accessibility.  A part of the project has been to write guidance documents that can help module teams and module accessibility specialists.  An important point is that each module should have a designated person who is responsible for helping to address accessibility issues during its production.  (But it could also be argued that all members of a module team should be involved too).

The documents are intended to provide up to date guidance (or, distilled expertise) to promote consistency across learning resources. The challenge with writing such guidance is that when we look at some accessibility issues, the detail can get pretty complicated pretty quickly.

The guidance covers a number of important subjects, such as how to make Word documents, PDFs, and pages that are delivered through the virtual learning environment as accessible as possible.  Echoing the previous talk, Martyn made the point that electronic documents have inherent advantages for people who have disabilities – the digital content can be manipulated and rendered in different ways.

Important points to bear in mind include the effective use of ALT texts (texts that describe images), the use of scalable images (for people who have visual impairments), effective design of tables, and the use of web links, headings and fonts.  Another important point was to apply ‘semantic tagging’, i.e. to design a document using tags that describe its structure (so it becomes navigable), and to deal with its graphical presentation separately.

I noted down an interesting point about Microsoft Word.  Martyn said that it is (generally speaking) a very accessible format, partly due to its ubiquity and the way that it can be used with assistive technologies, such as screen readers.
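To make the ‘semantic tagging’ idea a little more concrete, here is a small Python sketch of my own (it assumes the python-docx library and a made-up file name, and it certainly wasn’t part of Martyn’s talk).  It simply lists the heading styles used in a Word document; if nothing is listed, the document probably relies on bold body text rather than real headings, which makes it far harder to navigate with a screen reader.

from docx import Document   # pip install python-docx

doc = Document("module-guide.docx")   # hypothetical file name
for paragraph in doc.paragraphs:
    # Real headings carry styles named 'Heading 1', 'Heading 2' and so on;
    # assistive technologies use these to build a navigable outline.
    if paragraph.style.name.startswith("Heading"):
        print(paragraph.style.name, ":", paragraph.text)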

Martyn also addressed the issue about how to deal with accessibility of mathematics and other symbolic notations.  A notation system or language can help ideas to be comprehended and manipulated.  An important point was that in some disciplines, mastery of a notation system can represent an important learning objective.  During Martyn’s talk, I remembered a lecture that I attended a few months back (blog) about a notation scheme to describe juggling.  I also remember that a good notation can facilitate the discovery of new ideas (and the efficient representation of existing ones).

One of the challenges is how to take a notation scheme, which might have inherently visual and spatial properties, and convert it into a linear format that conveys similar concepts to users of assistive technologies, such as screen readers.  Martyn mentioned a number of mark-up languages that can be used to represent familiar notations: MathML and ChemML (Wikipedia) are two good examples.  The current challenge is that these notations are not supported in a consistent way across different internet browsers.  Music can be represented using something called music braille (although only a relatively small percentage of visually impaired people use braille), or MIDI code.
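To illustrate what ‘linearising’ a visual notation means, here is the quadratic formula written in LaTeX (my own example, not one from the talk).  The two-dimensional layout of the fraction and the square root is encoded as nested commands, and it is exactly this kind of explicit structure that mark-up languages such as MathML expose to assistive technologies:

x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}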

A personal reflection is that there is no silver bullet when it comes to accessibility.  Notation is a difficult issue to grapple with, and it relies on users making effective use of assistive technologies.  It’s also important to be mindful that AT, in itself, can be a barrier all of its own.  Before one can master a notation, one may well have to master a set of tools.

The question and answer session at the end of Martyn’s talk was also interesting.  An important point was raised that it’s important to embed accessibility into the module production process.  We shouldn’t ‘retrofit’ accessibility – we should be thinking about it from the outset.

Supporting visually impaired students in studying mathematics

The final presentation of the day was by my colleague Hilary Holmes, who is a maths staff tutor.  A comment that I’ve made (in my notebook) at the start of Hilary’s presentation is that the accessibility of maths is a challenging problem.  Students who are considering studying mathematics are told (or should be told) from the outset that maths is an inherently visual subject (which is advice that, I understand, is available in the accessibility guide for some modules).

Key issues include how to describe the notation (which can be inherently two dimensional), how to describe graphs and diagrams, how to present maths on web pages, and how to offer effective and useful guidance to staff and tutors.

First level modules make good use of printed books.  Printed books, of course, present fundamental accessibility challenges, so one solution to the notation (and book accessibility) issue is to use something called a DAISY book, which is a navigable audio book.  DAISY books can be created with either synthesised voices, or recorded human voices.  The university has the ability to record (in some cases) DAISY books through a special recording facility, which used to be a part of disabled student services.  One of the problems of ‘speaking’ mathematical notation is that ambiguities can quickly become apparent (‘one plus x over two’, for instance, could mean either (1 + x)/2 or 1 + x/2), but human readers are more able to interpret expressions, add pauses and use different tones to help convey different meanings.

Another approach is to use some software called AMIS (AMIS project home), which is an abbreviation for Adaptive Multimedia Information System. AMIS appears to be DAISY reader software, but it also displays text.

Diagrams present their own unique challenges.  Solutions might be to describe a diagram, or to create tactile diagrams, but tactile diagrams are limited in terms of what they can express.  Hilary subjected us all to a phenomenally complicated audio description which was utterly baffling, and then showed us a complex 3D plot of a series of equations and challenged us with the question, ‘how do you go about describing this?’  I’ve made a note of the following question in my notebook: ‘what do you have to do to get at the learning?’

Another approach to tackle the challenge of diagrams is to use something called sonic diagrams.  A tool called MathTrax (MathTrax website) allows users to enter mathematical expressions and have them converted into sound.  The pitch and character of a note change in accordance with the values that are plotted on a graph.  Two important points are: firstly, in some instances, users might need to draw upon the skills of non-medical helpers, and secondly (as mentioned earlier), these tools can take time to master and use.
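Out of curiosity, I’ve sketched (in Python) roughly how this kind of sonification might work.  This is purely my own illustration of the general idea, not how MathTrax actually does it: it ‘plots’ y = x squared over a small range, maps the values onto a pitch range, and writes the result to a WAV file, so the rise and fall of the curve can be heard.

import math
import struct
import wave

SAMPLE_RATE = 44100                  # audio samples per second
DURATION = 4.0                       # length of the clip in seconds
LOW_HZ, HIGH_HZ = 220.0, 880.0       # pitch range the graph is mapped onto

# 'Plot' y = x squared over x in [-2, 2] and normalise the values to 0..1
xs = [-2.0 + 4.0 * i / 399 for i in range(400)]
ys = [x * x for x in xs]
y_min, y_max = min(ys), max(ys)
norm = [(y - y_min) / (y_max - y_min) for y in ys]

samples = []
phase = 0.0
total = int(SAMPLE_RATE * DURATION)
for n in range(total):
    point = min(int(n / total * len(norm)), len(norm) - 1)   # which point is being 'read'
    freq = LOW_HZ + norm[point] * (HIGH_HZ - LOW_HZ)         # higher value, higher pitch
    phase += 2 * math.pi * freq / SAMPLE_RATE
    samples.append(int(32767 * 0.5 * math.sin(phase)))

with wave.open("sonified_graph.wav", "w") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)              # 16-bit audio
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(struct.pack("<" + "h" * len(samples), *samples))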

A final point that I’ve noted down is the importance of offering tutors support.  In some situations, tutors might be unsure what is meant by the phrase ‘reasonable adjustment’, and what they might be able to do in terms of helping to prepare resources for students (perhaps with help from the wider university).  Different students, of course, will always have very different needs, and it is these differences that we need to be mindful of.

It was really interesting to hear that Hilary has been involved with something called a ‘programme accessibility guide’.  This is a guide about the accessibility of a series of modules, not just a single module.  This addresses the problem of students starting one module and then discovering that there are some fundamental accessibility challenges on later modules.  This is certainly something that would be useful in ICT and computing modules, but an immediate challenge lies with how best to keep such a guide up to date.

Reflections

It was a useful event, especially in terms of being exposed to a range of rather different perspectives and issues (not to mention research approaches).  The presentations went into enough detail to really highlight the fundamental difficulties that learners can come up against.  I think, for me, the overriding theme was about how best to accommodate differences.  A related thought is that if we offer different types of resources (for all students), there might well be a necessity to share and explain how different types of electronic resources and documents can be used in different ways (and in different situations).

The Languages Open Resources Online website was recently mentioned in a regional development conference I attended a month or two back.  Sarah’s session got me thinking: I wondered whether it could be possible to create something similar for the Maths Computing and Technology faculty, or perhaps, specifically for computing and ICT modules (which is my discipline).  Sharing happens within modules, but it’s all pretty informal – but there might be something to be said for raising the visibility of the work that individual tutors do.  One random thought is that it could be called TOMORO, with the first three letters being an abbreviation for: Technology Or Mathematics. There are certainly many discussions to be had.

 

Christopher Douce

Social media toolkit workshop: Milton Keynes

Visible to anyone in the world
Edited by Christopher Douce, Tuesday, 8 Apr 2014, 15:55

26 March was another busy day.  In the morning I had managed to get myself onto something called a ‘social media toolkit workshop’.  In the afternoon, I had to go to an M364 Interaction Design (Open University) module team meeting.  This is a quick summary (taken from my paper-based analogue notes) of the workshop.  I should mention that I had to bail out of it early due to the other meeting commitment, so I wasn’t able to benefit from some of the closing discussions.  Nevertheless, I hope what is here might be of use to someone (!)

Objective

The university has created something called a social media toolkit which could be used by any academic (or any other group within the university) who might have an interest in using social media to share stories about projects or outcomes from research.  It is designed to be useful for those who are new to social media, as well as those who have a bit more experience. 

If you’re reading this from within the university, you might be able to access an early version of the toolkit (OU Social Media Toolkit).  In essence, the toolkit contains resources about how to capture and use different types of digital media, such as audio recordings, geo locations (or geodata), photos, text or video. The kit also aims (as far as I understand) to offer examples of how these different types of media could be used within an academic context.

The objective of the day was to introduce the toolkit to a group of interested participants to gather up some views about how it might be potentially enhanced, developed or improved.  Since I could only stick around for part of the day, I was only able to attend the first session, which comprised a forceful and evangelical presentation by Christian Payne, who runs a website (or social media hub) called Documentally.

The following sections have been edited together from the notes that I made on the day.

Social media and stories

Our presenter was very good at sharing pithy phrases.  One of the first that I’ve noted down is the phrase: ‘your story is your strategy about what you want to share’.  In retrospect, this phrase is tricky to unpack, but your strategy might well be connected to the tools that you use, and the tools might well connect to the types of media that you are able (or willing) to produce.

During the first session we were told about different tools.  Some tools were immediately familiar, such as Twitter and YouTube, but there were others that were more niche and less familiar, such as Flickr, FourSquare, Audioboo and Bamboozer.  (A point was made that YouTube can now be considered to be the web’s second biggest search engine).  Another interesting point (or strategy, or technique) was that all tools should be focused towards a hub, perhaps a website (or a blog).  This isn't a new idea: this blog connects up to my OU website, which also has a feed of recent publications.

Here are some more phrases I've noted.  It’s important to get stories seen, heard and interacted with, and ‘a social network is the interaction between a group of people who share a common interest’. 

A really interesting phrase is ‘engineering serendipity’; ‘serendipity lives in the possibility of others discovering your materials’.  The point is that it’s all about networks, and I can clearly sense that it takes time and effort to create and nurture those networks.

The power of audio

An area that was loosely emphasised was audio recordings.  Audio, it is stated, connects with the ‘theatre of the mind’ (which reminded me of a quote or a saying that goes, ‘radio has much better pictures than television’).  Audio also has a number of other advantages: it is intimate, and you can be getting along with other things at the same time on your device whilst you listen to an audio stream.  Christian held the view that ‘photoslide sharing can create better engagement than videos’.

There was a short section of the morning about interview techniques: start easy and then probe deeply, be interested, take time to create rapport and take the listener on a journey.  Editing tools such as GarageBand and Audacity were touched upon, and a number of apps were mentioned, such as Hokusai and SoundCloud (which allow you to top and tail a recording).

Audio recordings can be rough and ready (providing that you do them reasonably well).  Another point was: ‘give me wobbly video, or professional video, but nothing in between’.  I made a note that perhaps there is something authentic about the analogue world being especially compelling (and real) if it is presented in a digital way.  In a similar vein, I’ve also noted (in my analogue notebook) ‘if you throw out a sketch, people are drawn to it’ (and I immediately started thinking about a TEDTalk that I once saw that consisted of just talking and sketching – but I can’t seem to find it again!)

Here are two other phrases: ‘good content always finds an audience, but without context it’s just more noise’, and, ‘you can control your content, but not how people react to it’.  Whilst this second quote is certainly true, it connects to an important point about using the technology carefully and responsibly.

A diversion into technology

During the middle of the presentation part of the workshop, we were taken on a number of diversions into technology.  We were told about battery backups, solar powered mobile chargers and the importance of having a set of SIM cards (if you’re going to be travelling in different countries).  Your choice of devices (to capture and manipulate your media) is important.  Whilst you can do most things on a mobile phone, a laptop gives you that little bit more power and flexibility to collate and edit content.

We were also told about networking tools, such as PirateBox, which is a bit like a self-contained public WiFi internet in a box, which can allow other people (and devices) to connect to one another and share files without having to rely on other communications networks.

The structure of stories

Putting the fascinating technology aside, we return to the objective of creating stories through social media.  So, what are stories?  Stories, it is argued, have a reveal; they grab your attention.  It’s also useful to say something about the background, to contextualise a setting.  A story is something that we can relate to.  It can be a tale that inspires or makes us feel emotional.

We were told that a story, in its simplest form, is an anecdote, or it’s a journey.  An important element is the asking of questions (who, what, when, where, how), followed by a pay-off or resolution.  But when we are using many different tools to create different types of media, how do we make sense of it all?  We’re again back to the idea of a hub website.  A blog can operate as a curation tool.  It can become an on-line repository for useful links, notes and resources.

Reflections

The workshop turned out to be pretty interesting, and our facilitator was clearly very enthusiastic about sharing a huge amount of his life online.  There, I feel, lies an issue that needs to be explored further: the distinction between using these tools to share stories about your research (or projects), and how much of yourself you feel comfortable sharing.  I feel that, on some occasions, the two can become intertwined (since I personally identify myself with the research that I do).

On one hand, I can clearly see the purpose and the benefits of both producing and consuming social media.  On the other hand, I continue to hold a number of reservations. During the presentation, I raised some questions about security, particularly regarding geo-location data.  (I have generally tried to avoid explicitly releasing my GPS co-ordinates to all and sundry, but I’m painfully aware that my phone might well be automatically doing this for me).  An interesting comment from our facilitator was, ‘I didn’t realise that there would be so much interest in security’.  This, to me, was surprising, since it was one of the concerns that I had at the forefront of my mind.

Although I left the workshop early, I did feel that there was still perhaps more of an opportunity to talk about instances of good practice, i.e. examples of projects that made good use of social media to get their message out.  Our presenter gave many personal examples about reporting from war-torn countries and how he interviewed famous people, but I felt that these anecdotes were rather removed from the challenge of communicating about academic projects.

I can see there is clear value in knowing how to use different social media tools: they can be a very useful way to get your message across, and when your main job is about education and generating new knowledge, there’s almost an institutional responsibility to share.  Doing so, it is argued, has the potential to allow others to discover your work (in the different forms it might take), and to ‘engineer serendipity’.

I came away with a couple of thoughts.  Firstly: would I be brave enough to ever create my own wobbly video or short audio podcasts about my research interests?  This would, in some way, mean exposing myself in a rough and ready and unedited way.  I’m comfortable within the world of text and blogs (since I can pretty much edit what I say), but I feel I need a new dimension of confidence to embrace a new dimension of multimedia. 

Two fundamental challenges to overcome include: getting used to seeing myself on video and getting used to my own voice on audio recordings.  I can figure out how to use technology without too many problems (I have no problems with using any type of gadget; after all, I can just do some searches on YouTube).  The bigger challenge is addressing the dimension of performance and delivery.  I also remember the phrase, ‘just because everyone can [make videos or audio recordings], doesn’t necessarily mean that everyone should’.

I’m also painfully aware that research stories need to be interesting and engaging if they are to have impact.  I’m assuming that because I’m thinking of this from the outset, this is a good thing, right?

I’ll certainly be looking at the toolkit again, but in the meantime, I’ll continue to think about (and play with) some of the tools I’ve been introduced (and reintroduced) to.  Much food for thought.

Christopher Douce

Professional Development Conference: London, 22 March 2014

Visible to anyone in the world
Edited by Christopher Douce, Wednesday, 12 Oct 2022, 09:00

The Open University in London runs two professional development conferences per year, one at its regional offices in Camden Town, the other at the London School of Economics. Saturday 22 March was a busy day; it was the day I ran my first staff development session at this venue.  (I had previously run sessions in the Camden centre, but running a session in an external venue had, for some reason, a slightly different feel to it).

This blog post aims to summarise a number of key points from the session.  It is intended for anyone who might be remotely interested, but it’s mostly intended for fellow associate lecturers.  If you’re interested in the fine detail, or the contents of what was presented, do get in touch. Similarly, if you work within any other parts of the university and feel that this session might be useful for your ALs, do get in touch; I don’t mind travelling to other regions. 

Electronic assignments

The aim of the session was to share what I had discovered whilst figuring out how a tool called the ETMA file handler works.  Students at the university submit their assignments electronically through something called the Electronic Tutor Marked Assignment (ETMA) system.  This allows submissions to be held securely and the date and time of submission to be recorded.  It also allows tutors to collect (or download) batches of assignments that students have submitted.

When assignments are downloaded, tutors use a piece of software called the ETMA file handler.  This is a relatively simple piece of software that allows tutors to get an overview of which student has submitted which assignment.  It also allows tutors to see students’ work and to comment on (and mark) what they have submitted.

There are three things that a tutor usually has to do.  Firstly, they have to assign a mark for a student’s submission.  They usually also have to add some comments to a script that has been submitted (which is usually in the form of a Microsoft Word document).  They also have to add some comments to help a student to move forward with their studies.  These comments are entered into a form that is colloquially known as a PT3.  Please don’t ask me why it’s called this; I have no idea – but it seems to be an abbreviation that is deeply embedded within the fabric of the university.  If you talk to a tutor about a PT3 form, they know what you’re talking about.

Under the hood

Given that tutor-marked assignments constitute a pretty big part of the teaching and learning experience in the university, the ETMA file handler program is, therefore, a pretty important piece of software.  One of my own views (when it comes to software) is that if you understand how something works, you’ll be able to figure out how to use it better.

The intention behind my professional development session was to share something about how the ETMA file handler works, allowing tutors to carry out essential tasks such as making backups and moving sets of marking from one computer to another.  Whilst the university does a pretty good job of offering comprehensive training about how to use the file handler to enable tutors to get along with their job of marking, it isn’t so good at letting tutors know about how to do some of the system administration stuff that we all need to do from time to time, such as taking backups and moving files to another computer (hence my motivation to run this session).

One of my confessions is that I’m a computer scientist.  This means that I (sometimes) find it fun figuring out how stuff works.  This means that I sometimes mess around with a piece of software to see how to break it, and then try to get it working again.  (Sometimes I manage to do this, other times I don’t!)  During the session I focussed on a small number of things: how the file handler program knows about the assignments that have been downloaded (it uses directories), how directories are structured, what ‘special files’ these directories contain, and where (and how) additional information is held.

Here’s what I focussed on: the directories used to download files to, the directories used to return marked files from, and how the file handler reads the contents of those directories so it is able to offer choices to a tutor.  Towards the end of the presentation, I also presented a number of what I considered to be useful tips.  These were: the file handler software is very stupid, the file handler software needs to know where your marking is, form habits, be consistent, save files in the same place, use zip files to move files around, and be paranoid!
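To give a flavour of the ‘use zip files’ tip, here is a tiny Python sketch of my own (the folder name is hypothetical; check where the file handler actually keeps your marking on your machine).  It zips the whole marking directory into a single dated archive that can be kept as a backup, or copied onto another computer and unpacked there.

import shutil
from datetime import date
from pathlib import Path

marking_dir = Path.home() / "eTMA files"        # hypothetical location of your downloaded marking
backup_name = "etma-backup-" + date.today().isoformat()

# Zip everything below the marking directory into a single dated archive
archive = shutil.make_archive(backup_name, "zip", root_dir=marking_dir)
print("Backup written to", archive)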

Reflections

Whilst I was writing the session, I thought to myself, ‘is this going to be too simple?’ and ‘surely everyone will get terribly bored with all this detail and all the geeky stuff that I’m going to be talking about?’  Thankfully, these fears were unfounded.  The detail, it turned out, seemed to be quite interesting.  Even if I was sharing the obvious, sometimes a shared understanding can offer some reassurance.

There were parts that went right, and other parts that went wrong (or, not so well as I had expected); both represented opportunities for learning.  The part that I almost got right was about timing.  I had an hour and a half to fill, and although the session had to be wrapped up pretty quickly (so everyone could get their sandwiches), the timing seemed to be (roughly) about right.

The part that I got wrong wasn’t something that was catastrophically wrong, but instead could be understood in terms of an opportunity to improve the presentation the next time round.  We all use our computers in slightly different ways, and I have to confess that I became particularly fixated on using my own computer in quite a needlessly complicated way (in terms of how to create and use backup files).  As a result, I now have slightly more to talk about, which I think is a good thing (but I might have to re-jig the timing).

There is one implicit side effect of sharing how something is either designed, or how something works.  When we know how something works, we can sometimes find new ways of working, or new ways to use the tools that we have at our disposal.  Whilst probing a strange piece of software can be a little frightening, it’s sometimes possible to find unexpected rewards.  We may never know what these are, unless we spend time doing this.

And finally…

If you’re an associate lecturer, do try to find the time to come to one of the AL development events; you’re always likely to pick something up from the day (and this applies as much to the facilitator as it does to the tutor!)  As well as being useful, they can also be good fun!

After the session had been completed, and the projectors and laptops were turned off, I started to ask myself a question.  This was: ‘what can I do for the next conference?’  Answering this question is now going to be one of my next tasks.

Christopher Douce

Associate Lecturer Professional Development Conference: Kent College, Tonbridge

Visible to anyone in the world
Edited by Christopher Douce, Monday, 24 Mar 2014, 14:14

The Open University in the South East ran one of their associate lecturer professional development conferences on 1 March 2014.  This year, the conference was held at Kent College, Tonbridge.  I don’t know whether I wrote about this before, but this was the same site where I attended my first ever OU tutorial (as a rookie tutor).  Today, the site is very different. Then it was gloomy and dark.  Now, the buildings are bright and airy, and boast a spectacular view of the Kent countryside.

This post is a very brief summary of the event.  The summary has drawn directly from the notes that I made during the day (and these, by definition, will probably contain a couple of mistakes!)  It also contains a bunch of rough reflections.  I should add that this blog is primarily intended for other associate lecturer colleagues but it might accidentally be of wider interest to others too.

During this conference, I signed up for two sessions.  The first was entitled, ‘supporting academic writing’.  The second session was all about, ‘aligning TMA feedback to students’ needs and expectations’. 

Supporting academic writing

This first session was facilitated by Anna Calvi, who projected a set of phrases about academic writing onto a digital whiteboard.  A couple of examples were, ‘what is a semi-colon?’ and ‘I think of ideas and information as I write’. ‘Do any of you recognise these?  Which are the most important for you?’ Anna asked, challenging us to respond.  She didn’t have to wait long for an answer.

A couple of responses that I noted down were: explaining why structure is important, the importance of paraphrasing and differences between written English and spoken English.  There’s also the necessity to help students to understand what is meant by ‘written academic English’.  Some suggestions were immediately forthcoming: the choice of vocabulary, style and appropriate referencing.

One of the participants asked a question that I have heard asked before.  This was, ‘can all faculties have a module that helps students to write descriptively?’  The truth of the matter is that different faculties do different things.  In the Mathematics Computing and Technology faculty, writing skills are embedded (and emphasised) within the introductory level 1 modules.  Other faculties have dedicated modules.  Two key modules are LB160 Professional Communication Skills for Business Studies, and L185 English for Academic purposes, which I understand can contribute credit to some degree programmes.

During this session, all the tutors were directed towards other useful resources.  These include a useful student booklet entitled Reading and Taking Notes (PDF), which is connected to an accompanying Skills for Study website (OU website).  Another booklet is entitled Thinking Critically (PDF).  This one is particularly useful, since the terms ‘analyse critically’ and ‘critically evaluate’ can (confusingly) appear within module texts, assignments and exams.

One of the points shared during this first session was really important: it’s important to emphasise what academic writing is right at the start of a programme of study.

What needs to be done?

So, how can tutors help?  Anna introduced us to a tool known as the MASUS framework.  MASUS is an abbreviation for Measuring the Academic Skills of University Students, and the framework originally came from the University of Sydney.  We were directed to a video (OU website) which describes what the framework is and how it works.  A big part of the framework (from what I remember) is a checklist for academic writing (OU website).  In essence, this tool helps us (tutors) to understand (or think about) what kind of academic writing support students might need.  Key areas can include the use of source materials (choosing the right ones), organising a response in an appropriate way, using language that is appropriate to both the audience and the task, and so on.  In some respects, the checklist is an awareness-raising tool.  The tutor’s challenge lies in how to talk to students about aspects of writing.

If you’re interested, a more comprehensive summary of the MASUS framework (PDF) is available directly from the University of Sydney.  Another useful resource is the OU’s own Developing academic English which tutors can refer students to.  We were also directed to an interesting external resource, a Grammar tutorial, from the University of Bristol.

Offering feedback

After looking at the checklist and these resources we moved onto a wider discussion about how best tutors can help students to develop their academic writing.  I’ve made a note of two broad approaches; one is reactive, the other is proactive. A reactive strategy might include offering general backward looking feedback and perhaps running a one to one session with a student.  A proactive approach, on the other hand, could include discussions through a tutor group forum, activities within tutorials, sharing of hand outs that contain exercises and practical feed-forward advice within assignments that have been returned.

TMA feedback can, for example, give examples (or samples) of what is considered to be effective writing.  An important point that emerged from the discussions was the need to be selective, since commenting on everything can be very overwhelming.  One approach is to offer a summary and provide useful links (and pointers) to helpful resources.

On-line tutorials

Anna moved onto the question of what tutors might (potentially) do within either face to face or on-line tutorials to help students with their academic writing; this was the part of the sessions where tutors had an opportunity to share practice with each other.  Anna also had a number of sample activities that we could either use, modify, or draw teaching inspiration from.

The first example was an activity where students had to choose key paragraphs from a piece of writing.  Students could then complete a ‘diagram’ to identify (and categorise) different parts (or aspects of an argument).  Another activity might be to ask students to identify question words, key concepts and the relationships between them. 

Further ideas include an activity to spot (or identify) the parts of an essay, such as introductory sentences, background information, central claims and perhaps a conclusion.  A follow-on activity might be to ask questions about the purpose of each section, then connecting the discussion to the tasks that are required for an assignment.

There was also a suggestion of using some cards.  Students could be asked to match important terms written on cards to paragraphs. Terms could include: appropriate tone, formality, alternative views, vocabulary, linking words, and so on.  There would also be an opportunity to give examples, to allow tutors to emphasise the importance of writing principles.

A further tip was to search the OpenLearn website for phrases such as ‘paraphrasing’ (or module codes, such as L185) for instance.  The OpenLearn site contains some very useful fragments of larger courses which might be useful to direct students to.

Aligning TMA feedback to students’ needs and expectations

This second session was facilitated by Concha Furnborough.  Her session had the subheading, ‘how well does our feedback work?’, which is a very important question to ask.  It soon struck me that this session was about the sharing of research findings with the intention of informing (and developing) tutor practice.

I’ve made a note of another question: how do we bridge the gap between actual and desired performance?  Connecting back to the previous session, a really important principle is to offer ‘feed-forward’ comments, which aim to guide (and alter) future behaviour.

An early discussion point that I noted was that some students don’t take the time to download their feedback (after they have discovered what their assignment marks were).  We were all reminded that we (as tutors) really need to take the time to make sure students download the feedback that they are entitled to receive.

This session described some of the outcomes from a project called eFeP, which is an abbreviation for e-Feedback evaluation project, funded by Jisc (which supports the use of digital technologies in education and research).  If you’re interested, more information about the project is available from the eFeP project website (Jisc).

The aim of the project was to understand the preferences and perceptions that students have about the auditory and written feedback that are offered by language tutors.  The project used a combination of different techniques.  Firstly, it used a survey.  The survey was followed by a set of interviews.  Finally, ten students were asked to make a screen-cast recording; students were asked to talk through their responses to the feedback and guidance offered by their tutors.

One of the most interesting parts of the presentation (for me) was a description of a tool known as ‘feedback scaffolding’.  The ‘scaffolding’ corresponds to the different levels or layers of feedback that are offered to students.  The first level relates to a problem or issue that exists in an assignment.  Level two relates to an identification of the type of error.  If we’re thinking in terms of language teaching, this might be the wrong word case (or gender) being applied.  The third level is where an error is corrected.  The fourth is where an explanation is given, and the fifth is clear advice on how performance might be potentially improved.

Feeling slightly disruptive, I had to ask a couple of questions.  Firstly, I asked whether there was a category where tutors might work to contextualise a particular assignment or question, i.e. to explain how it relates to the subject as a whole, or to explain why a question is asked by a module team.  In some respects, this can fall under the final category, but perhaps not entirely.

My second question was about when in their learning cycle students were asked to comment on their feedback.  The answer was that they gave their feedback once they had taken the time to read through and assimilate the comments and guidance that the tutors had offered.   Another thought would be to capture how feedback is understood the instant that it is received by a learner.  (I understand that the researchers have plans to carry out further research).

If anyone is interested, there is a project blog (OU website), and it’s also possible to download a copy of a conference paper about the research from the OU’s research repository.

Reflections

Even though I attended only two sessions, there was a lot to take in.  One really interesting aspect was hearing different views about the challenges of academic writing from people who work in different parts of the university.  I’ve heard it said that academic writing (of the type needed to complete TMAs) is very tough if you’re doing it for the first time.  In terms of raising awareness of different resources that tutors could use to help students, the first session was especially useful.

These conferences are not often used to disseminate research findings, but the material that was covered in the second session was especially useful.  It exposed us to a new feedback framework (one that I wasn’t aware of), and it directly encouraged us to consider how our feedback is perceived and used.

One of the biggest benefits of these conferences is that they represent an opportunity to share practices.  A phrase that I’ve often heard is, ‘you always pick up something new’.

Copies of the presentations used during the conference can be found by visiting the South East Region conference resources page (OU website, staff only).

Footnote

A week after drafting this summary, I heard that the university plans to close the South East regional centre in East Grinstead.  I started with the South East region back in 2006, and it was through this region that I began my career as an associate lecturer.

All associate lecturers are offered two days of professional development as a part of their contract, and the events that the region has offered have helped to shape, inform and inspire my teaching practice.  Their professional development events have helped me to understand how to run engaging tutorials, my comfort zone has also been thoroughly stretched through inspiring ‘role play’ exercises, and I’ve also been offered exceptional guidance about how to provide effective correspondence tuition.

Without a doubt, the region has had a fundamental and transformative effect on how I teach and has clearly influenced the positive way that I view my role as an associate lecturer.  The professional development has always been supportive, respectful and motivating.

The implications of the closure of the South East region for continuing professional development for both new and existing tutors are currently unclear.  My own view is probably an obvious one: if these rare opportunities for sharing and learning were to disappear, the support that the university offers its tutors would be impoverished.

Christopher Douce

e-Learning community event: mobile devices

Visible to anyone in the world
Edited by Christopher Douce, Thursday, 20 Feb 2014, 12:01

Mobile devices are everywhere.  On a typical tube ride to the regional office in London, I see loads of different devices.  You can easily recognise the Amazon Kindle; you see the old type with buttons, and the more modern version with its touch screen.  Other passengers read electronic books with Android and Apple tablets.  Other commuters study their smart phones with intensity, and I’m fascinated with what is becoming possible with the bigger screen phones, such as the Samsung Note (or phablets, as I understand they’re called).  Technology is giving us both convenience and an opportunity to snatch moments of reading in the dead time of travel.

I have a connection with a module which is all about accessible online learning (H810 module description).  In the context of the module, accessibility is all about making materials, products and tools usable for people who have disabilities.  Accessibility can also be considered in a wider sense, in terms of making materials available to learners irrespective of their situation or environment.  In the most recent presentation of H810, the module team has made available much of the learning materials in eBook or Kindle format.  The fact that materials can be made available in this format can be potentially transformative and open up opportunities to ‘snatch’ more moments of learning.

An event I attended on 11 February 2014, held in the university library, was all about sharing research and practice about the use of mobile devices.  I missed the first presentation, which was all about the use of OU Live (an on-line real time conferencing system) using tablet devices.  The other two presentations (which I’ve made notes about) explored two different perspectives: the perspective of the student, and the perspective of the associate lecturer (or tutor).

(It was also interesting to note that the event was packed to capacity; it was standing room only.  Mobile technology and its impact on learning seems to be a hot topic).

Do students study and learn differently using e-readers?

The first presentation I managed to pay attention to was by Anne Campbell who had conducted a study about how students use e-readers.  Her research question (according to my notes) was whether users of these devices could perform deep reading (when you become absorbed and immersed in a text) and active learning, or alternatively, do learners get easily distracted by the technology?  Active learning can be thought of as carrying out activities such as highlighting, note taking and summarising – all the things that you used to be able to do with a paper based text book and materials.

Anne gave us a bit of context.  Apparently half of OU postgraduate students use a tablet or e-reader, and most use it for studying.  Also, half of UK households have some kind of e-reader.  Anne also told us that there was very little research on how students study and learn using e-readers.  To find out more, Anne conducted a small research project about how students consume and work with electronic resources and readers.

The study comprised seventeen students.  Six students were from the social sciences and eleven students were studying science.  All were from a broad range of ages.  The study was a longitudinal diary study.  Whenever students used their devices, they were required to make an entry.  This was complemented with a series of semi-structured interviews.  Subsequently, a huge amount of rich qualitative data was collected and then analysed using a technique known as grounded theory.  (The key themes and subjects that are contained within the data are gradually exposed by looking at the detail of what the participants have said and have written).

One of the differences between using e-readers and traditional text books is the lack of spatial cues.  We’re used to the physical size of a book, so it’s possible to (roughly) know where certain chapters are once we’re familiar with its contents.  It’s also harder to skim read with e-readers, but on the other hand this may force readers to read in more depth.  One comment I’ve noted is, ‘I think with the Kindle… it is sinking in more’.  This, however, isn’t true for all students.

I’ve also noted that there are clear benefits in terms of size.  Some text books are clearly very heavy and bulky; you need a reasonably sized bag to move them around from place to place, but with an e-reader, you can (of course) transfer all the books that you need for a module to the device.  Other advantages are that you can search for key phrases using an e-reader.  I’ve learnt that some e-readers contain a built in dictionary (which means that readers can look up words without having to reach for a paper dictionary).  Other advantages include a ‘clickable index’ (which can help with navigation).  Other more implicit advantages can include the ability to change the size of the text of the display, and the ability to use the ‘voice readout’ function of a mobile device (but I don’t think any participants used this feature).

I also noted that e-readers might not be as well suited for active learning for the reasons that I touched on above, but apparently it’s possible to perform highlights and to record notes within an ebook.

My final note of this session was, ‘new types of study advice needed?’   More of this thought later.

Perspectives from a remote and rural AL

Tamsin Smith, from the Faculty of Science, talked about how mobile technology helps her in her role as an associate lecturer.  I found the subject of this talk immediately interesting and was very keen to learn about Tamsin’s experiences.  One of the modules that Tamsin tutors on consists of seven health science books.  The size and convenience of e-readers can also obviously benefit tutors as well as students.

On some modules, key documents such as assignment guides or tutor notes are available as PDFs.  If they’re not directly available, they can be converted into PDFs using freely available software tools.  When you have got the documents in this format, you can access them using your device of choice.  In Tamsin’s case, this was an iPad mini. 
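As an aside, one freely available route is to ask LibreOffice (if it is installed) to do the conversion in ‘headless’ mode; the file name below is just an example, and the same thing can be done directly from a command line without Python at all.

import subprocess

# Convert a Word document to PDF using LibreOffice's headless mode
subprocess.run(
    ["libreoffice", "--headless", "--convert-to", "pdf", "tutor-notes.docx"],
    check=True,
)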

On the subject of different devices, Tamsin also mentioned a new app called OU Anywhere, which is available for both iOS and Android devices.  After this talk, I gave OU Anywhere a try, downloading it to my smartphone.  I soon saw that I could access all the core blocks for the module that I tutor on, along with a whole bunch of other modules.  I could also access videos that were available through the DVD that was supplied with the module.  Clearly, this appeared to be (at first glance) pretty useful, and was something that I needed to spend a bit more time looking at.

Other than the clear advantages of size and mobility, Tamsin also said that there were other advantages.  These included an ability to highlight sections, to add notes, to save bookmarks and to perform searches.  Searching was highlighted as particularly valuable.  Tutors could, for example, perform searches for relevant module materials during the middle of tutorials. 

Through an internet connection, our devices can allow access to the OU library, on line tutorials through OU Live (as covered during the first presentation that I missed), and tutor group discussion forums allowing tutors to keep track of discussions and support students whilst they’re on the move.  This said, internet access is not available everywhere, so the facility to download and store resources is a valuable necessity.  This, it was said, was the biggest change to practice; the ability to carry all materials easily and access them quickly. 

One point that I did learn from this presentation is that there is an eTMA file handler available for the iPad (but not one that is officially sanctioned or supported by the university).

Final thoughts

What I really liked about Anne’s study was its research approach: it used something called a diary study (a technique that is touched on as part of the M364 Interaction Design module).  The study aimed to learn how learning is actually done.  It struck me that some learners (including myself) might have to experiment with different combinations of study approaches and techniques to find out what works and what doesn’t.  Study technique, I thought, might be a matter of judgement for the individual.

When I enrolled on my first postgraduate module with the Open University, I was sent a book entitled The Good Study Guide by Andrew Northedge (companion website).  It was one of those books where I thought to myself, ‘how come it’s taken me such a long time to get around to reading this?’, and, ‘if only I had read this as an undergraduate, I might perhaps have managed to get a higher score in some of my exams’.  It was packed full of practical advice about topics such as time management, using a computer to study, reading, making notes, writing and preparing for exams.

It was interesting to hear from Anne’s presentation that studying with our new-fangled devices is that little bit different.  Whilst on the one hand we lose some of our ability to put post-it notes between pages and to see where our thumbs have been, we gain mobility, convenience and extra facilities such as searching.

It is very clear that more and more university materials can now be accessed using electronic readers.  Whilst this is likely to be a good thing (in terms of convenience), there are two main issues (connected to each other) that I think we need to bear in mind.

The first is a very practical issue: how do we get the materials onto our devices?  Two related questions are: how can we move our materials between different devices? and, how do we effectively manage the materials once we have saved them to our devices?  We might end up downloading a whole set of different files, ranging from module blocks to assignments and other guidance documents.  It’s important to figure out a way to best manage these files: we need to be literate in how we use our devices.  (As an aside, these questions loosely connect with the nebulous concept of the Personal Learning Environment.)

The second issue relates to learning.  In the first presentation, Anne mentioned the term ‘active learning’.  The Good Study Guide contains a chapter about ‘making notes’.  Everyone is different, but I can’t help but think that there’s an opportunity for ‘practice sharing’.  What I mean is that there’s an opportunity to share stories of how learners can effectively make use of these mobile devices, perhaps in combination with more traditional approaches for study (such as note taking and paraphrasing).  Sharing tips and tricks about how mobile devices can fit into a personalised study plan has the potential to show how these new tools can be successfully applied.

A final thought relates to the broad subject of learning design.  Given that half of all households now have access to e-readers of one form or another (as stated in the first presentation I’ve covered), module teams need to be mindful of the opportunities and challenges that these devices can offer.  Although this is slightly away from my home discipline and core subject, I certainly feel that there is work to be done to further understand what these challenges and opportunities might be.  I’m sure that there has been a lot more work carried out than I am aware of.  If you know of any studies that are relevant, please feel free to comment below.

Video recordings of these presentations are available through the university Stadium website.

Permalink 1 comment (latest comment by Jonathan Vernon, Wednesday, 5 Mar 2014, 23:38)
Christopher Douce

Gresham College: Designing IT to make healthcare safer

Visible to anyone in the world

On 11 February, I was back at the Museum of London.  This time, I wasn’t there to see juggling mathematicians (Gresham College) talking about theoretical anti-balls.  Instead, I was there for a lecture about the usability and design of medical devices by Harold Thimbleby, who I understand was from Swansea University. 

Before the lecture started, we were subjected to a looped video of a car crash test: a modern car from 2009 was crashed into a car built in the 1960s.  The result (and later point) was obvious: modern cars are substantially safer than older cars, and continual testing and development makes a difference.  Even so, Harold made a really interesting point.  He said, ‘if bad design was a disease, it would be our 3rd biggest killer’.

Computers are everywhere in healthcare.  Perhaps introducing computers (or mobile devices) might be able to help?  This might well be the case, but there is also the risk that hospital staff might end up spending more time trying to get the technology to do the right thing rather than dealing with more important patient issues.  There is an underlying question of whether a technology is appropriate or not.

This blog post has been pulled directly from the notes that I made during the lecture.  If you’re interested, I’ve provided a link to the transcript of the talk at the end.

Infusion pumps

Harold showed us pictures of a series of infusion pumps.  I didn’t know what an infusion pump was.  Apparently it’s a device that is a bit like an intravenous drip, but you program it to dispense a fluid (or drug) into the blood stream at a certain rate.  I was very surprised by the pictures: every infusion pump looked very different from the others, and the differences were quite shocking.  They each had different screens and different displays.  They were different sizes and had different keypad layouts.  It was clear that there was little in the way of internal or external consistency.  Harold made an important point: they were ‘not designed to be readable, they were designed to be cheap’ (please forgive my paraphrasing here).

We were regaled with further examples of interaction design terror.  A decimal point button was placed on an arrow key; it was clear that there was no appropriate mapping between the button and its intended task.  Pushing a help button gave little in the way of help to the user.

We were told of a human factors analysis study where six nurses were required to use an infusion pump over a period of two hours (I think I’ve noted this down correctly).  The conclusion was that all of the nurses were confused.  Sixty percent of the nurses needed hints on how to use the device, sixty percent were confused by how the decimal point worked (in this particular example) and, strikingly, sixty percent entered the wrong settings.

We’re not talking about trivial mistakes here; we’re talking about mistakes where users may be fundamentally confused by the appearance and location of a decimal point.   Since we’re also talking about devices that dispense drugs, small errors can become life threateningly catastrophic.

Calculators

Another example of a device where errors can become significant is the common hand-held calculator.  Now, I was of the opinion that modern calculators were pretty idiot proof, but it seems that I might well be the idiot for assuming this.  Harold gave us an example where we had to calculate a simple percentage of the world population.  Our hand-held calculator threw away zeros without telling us, without giving us any feedback.  If we’re not thinking, and since we implicitly assume that calculators carry out calculations correctly, we can easily assume that the answer is correct too.  The point is clear: ‘calculators should not be used in hospitals, they allow you to make mistakes, and they don’t care’.
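I don’t know exactly how the calculator in Harold’s example was built, but here is a little Python sketch of the general failure mode: a display that only holds eight digits quietly throws the rest away, and a simple percentage of the world population comes out wildly wrong.

    # A sketch of the general failure mode, not of the specific calculator in the lecture:
    # an eight-digit display silently drops trailing digits and gives no feedback.
    def eight_digit_display(value):
        """Return a number as a cheap eight-digit calculator might show it."""
        digits = f'{value:.0f}'
        return int(digits[:8]) if len(digits) > 8 else value

    world_population = 7_000_000_000
    correct = world_population * 0.01                       # 1% is 70,000,000
    displayed = eight_digit_display(world_population) * 0.01
    print(correct, displayed)                               # 70000000.0 versus 700000.0

The point isn’t the arithmetic; it’s the lack of feedback – nothing tells the user that two zeros have silently vanished.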

Harold made another interesting point: when we use a calculator we often look at the keypad rather than the screen.  We might have a mental model of how a calculator works that is different to how it actually responds.   Calculators that have additional functions (such as a backspace, or delete last keypress buttons) might well break our understanding and expectations of how these devices operate.  Consistency is therefore very important (along with the visibility of results and feedback from errors).

There was an interesting link between this Gresham lecture and the lecture by Tony Mann (blog summary), which took place in January 2014.  Tony made the exact same point that Harold did: when we make mistakes, we can very easily blame ourselves rather than the devices that we’re using.  Since we hold this bias, we’re also reluctant to raise concerns about the usability of the devices and equipment that we’re using.

Speeds of Thinking

Another interesting link was that Harold drew upon research by Daniel Kahneman (Wikipedia), explicitly connecting the subject of interface design with the subject of cognitive psychology.  Harold mentioned one of Kahneman’s recent books, ‘Thinking, Fast and Slow’, which posits that there are two cognitive systems in the brain: a perceptual system which makes quick decisions, and a slower system which makes more reasoned decisions (I’m relying on my notes again; I’ve got Daniel’s book on my bookshelves, amidst loads of others I have chalked down to read!)

Good design should take account of both the fast and the slow system.  One really nice example was with the use of a cashpoint to withdraw money from your bank account.  Towards the end of the transaction, the cashpoint begins to beep continually (offering perceptual feedback).  The presence of the feedback causes the slower system to focus attention on the task that has got to be completed (which is to collect the bank card).   Harold’s point is simple: ‘if you design technology properly we can make the world better’.

Visibility of information

How do you choose one device or product over another?  One approach is to make usually hidden information more visible to those who are tasked with making decisions.  A really good example of this is the energy efficiency ratings on household items, such as refrigerators and washing machines.  A similar rating scheme is available on car tyres too, exposing attributes such as noise, stopping distance and fuel consumption.  Harold’s point was: why not create a rating system for the usability of devices?

Summary

The Open University module M364 Fundamentals of Interaction Design highlights two benefits of good interaction design: an economic argument (good usability can save time and money), and a safety argument.

This talk clearly emphasised the importance of the safety argument and highlighted good design principles (such as those described by Donald Norman): visibility of information, feedback of action, consistency between and within devices, and appropriate mapping (which means that the buttons that are pressed should do the operation that they are expected to do).

Harold’s lecture concluded with a number of points that relate to the design of medical devices.  (Of which there were four, but I’ve only made a note of three!)  The first is that it’s important to rigorously assess technology, since this way we can ‘smoke out’ any design errors and problems (evaluation is incidentally a big part of M364).  The second is that it is important to automate resilience, or to offer clear feedback to the users.  The third is to make safety visible through clear labelling.

It was all pretty thought-provoking stuff, which was very clearly presented.  One thing that struck me (mostly after the talk) is that interactive devices don’t exist in isolation – they’re always used within an environment.  Understanding the environment, and the way in which communications occur between the different people who work within it, is also important (and there are different techniques that can be used to learn more about this).

Towards the end of the talk, someone else asked the very question that I had in mind: ‘is it possible to draw inspiration from the aviation industry and apply it to medicine?’  It was a very good question.  I’ve read (in another OU module) that an aircraft cockpit can be used as a way to communicate system state to both pilots.  Clearly, this is a subject of on-going research, and Harold directed us to a site called CHI Med (computer-human interaction).

Much food for thought!  I came away from the lecture feeling mildly terrified, but one consolation was that I had at least learnt what an infusion pump was.  As promised, here’s a link to the transcript of the talk, entitled Designing IT to make healthcare safer (Gresham College). 

Christopher Douce

Bletchley Park archive course

Visible to anyone in the world

At the end of January, I took a day off from my usual duties and went to an event called the ‘Bletchley Park archive course’.  I heard about the course through the Bletchley Park mailing list.  As soon as I received the message telling me about it I contacted the organisers straight away but, unfortunately, I was already too late: there were no longer any spaces on the first event.  Thanks to a kind-hearted volunteer, I was told about this follow-up event.

This blog post is likely to be the first of a number of blog posts about Bletchley Park, a place that is significant not only in terms of Second World War intelligence gathering and analysis, but also in the history of computing.  It’s a place I’ve been to a couple of times, but this visit had a definite purpose: to learn more about the archives and what they might be able to tell a very casual historian of technology, like myself.

I awoke at about half six in the morning, the usual time for when I have to travel to Milton Keynes, and found my way to my local train station.  The weather was shocking, as it had been for the whole of January.  I was wearing sturdy boots and had donned a raincoat, as instructed by the course organisers.  Two trains later, I was at Euston Station, ready to take the relatively short journey north towards Milton Keynes, and then on to the small town of Bletchley, just one stop away.

Three quarters of an hour later, after walking through driving rain and passing what appeared to be a busy building site, I had found the room where the ‘adult education’ course was to take place.

Introduction and History

The day was hosted by Bletchley Park volunteer Susan Slater.  Susan began by talking about the history of the site that was to ultimately become a pivotal centre for wartime intelligence.  Originally belonging to a financier, the Bletchley Park manor house and adjoining lands were put up for auction in 1937.

Bletchley was a good location; it was pretty inconspicuous.  It was also served by two railway lines: one that went to London, and another that ran from east to west, connecting the universities of Oxford and Cambridge.  Not only was it well served in terms of transport, the railway also offered other kinds of links – it was possible to connect to telecommunication lines that, I understand, ran next to the track.  Importantly, it was situated outside London (and away from the never-ending trials of the Blitz).

Susan presented an old map and asked us what we thought it was.  It turned out to be a map of the telegraph system during the time of the British Empire; red wires criss-crossed the globe.  The telegraph system can be roughly considered to be a ‘store and forward’ system.  Since it was impossible (due to the distances involved) to send a message directly from England to, say, Australia, messages (sent in Morse code) were relayed via a number of intermediate stations (or hubs).

Susan made the point that whoever ran the telecommunication hubs was also able to read all the messages that passed through them.  If you want your communications to be kept secret, the thing to do is to encode them in some way.  Interestingly, Susan also referred to Edward II and a decree from around 1324 (if I understand this correctly!) which stated that ‘all letters coming from or going to parts overseas [could] be seized’.  Clearly, the contemporary debates about the interception of communications have very deep historical roots.

We were introduced to some key terms.  A code is a representation of letters and words by other letters and words.  A cypher is a scheme by which letters are replaced with other letters.  I’ve also noted that if something is formulaic (or predictable), then it can become breakable (which is why you want to hide the artefacts of language – certain characters in a language are statistically more frequent than others, for example).  The most secure way to encode a message is to use what is called a one-time pad (Wikipedia): an encoding mechanism that is used only once and then thrown away.
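For the curious, the principle is easy to sketch in a few lines of Python (the wartime pads were printed tables of letters rather than random bytes, but the idea is the same): combine the message with a truly random pad of the same length, use it once, and then destroy it.

    # A minimal one-time pad sketch: XOR the message with a random pad of equal length.
    import secrets

    def xor_bytes(data, pad):
        return bytes(d ^ p for d, p in zip(data, pad))

    message = b'MEET AT STATION X'
    pad = secrets.token_bytes(len(message))    # truly random, used once, never reused
    ciphertext = xor_bytes(message, pad)
    print(xor_bytes(ciphertext, pad))          # applying the same pad again recovers the message

The catch, of course, is that the pad has to be as long as the message, genuinely random, and shared securely in advance – which is why it tends to be reserved for only the most sensitive traffic.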

An Enigma machine (Wikipedia), which sat at the front of the classroom, was an electro-mechanical implementation of an encoding mechanism.  Susan outlined its design to us: it had a keyboard like a typewriter, a plug board (to replace one letter with another), three or four rotors, each with the same number of positions as there were characters (and which moved every time you pressed a key), and wiring within the rotors that changed the ‘letters’ even further.

Second session: how it all worked

After a swift break, we dived straight into the second session, where we were split into two teams.  One team had to encrypt a message (using the Enigma machine), and the second team had to use the same machine to decrypt the same message (things were made easier since the ‘decrypting side’ knew what all the machine settings were).   I think my contribution was to either press a letter ‘F’ or a letter ‘Q’ – I forget!  Rotors turned and lights lit up.  The seventy-something year old machine still did its stuff.
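Out of curiosity, here is a little Python sketch of my own – nothing like a real Enigma, just a self-inverse plug board and a single ‘rotor’ position that steps on every keypress – which shows why both teams could use exactly the same settings: the same configuration both scrambles and unscrambles a message.

    # A toy 'plug board plus stepping rotor' cipher: an illustration of the principle,
    # not a model of the actual Enigma machine.
    import string

    ALPHABET = string.ascii_uppercase

    def step_cipher(text, plug_pairs, start_pos, decrypt=False):
        plug = {c: c for c in ALPHABET}
        for a, b in plug_pairs:                 # the plug board swaps letters in pairs,
            plug[a], plug[b] = b, a             # so it is its own inverse
        out, pos = [], start_pos
        for ch in text:
            if decrypt:
                shifted = ALPHABET[(ALPHABET.index(ch) - pos) % 26]
                out.append(plug[shifted])       # undo the shift, then undo the plug board
            else:
                out.append(ALPHABET[(ALPHABET.index(plug[ch]) + pos) % 26])
            pos = (pos + 1) % 26                # the 'rotor' steps after every keypress
        return ''.join(out)

    settings = dict(plug_pairs=[('A', 'M'), ('T', 'Q')], start_pos=7)
    secret = step_cipher('ATTACKATDAWN', **settings)
    print(secret, step_cipher(secret, decrypt=True, **settings))

A real Enigma added a reflector and several rotors geared together, but even this toy hints at how quickly the number of possible settings grows.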

What follows are some rough notes from my notebook (made quickly during the class).  We were told that different parts of the German military used different code books (and that the naval Enigma machine was different from the other Enigma machines).  Each code book lasted for around six weeks.  The code book contained information such as the day, the rotor positions, the starting points of the rotors and the plug board settings; everything you needed to make understandable messages totally incomprehensible.

The challenge was, of course, to uncover what the settings of an Enigma machine were (so messages could be decrypted).  A machine called the Bombe (Wikipedia) was invented to help with the process of figuring out what the settings might be.  When the settings were (potentially) uncovered, they were tested by entering them into a machine called the Typex (which was, in essence, a version of an Enigma machine) along with the original message, to see if plain text (an unencrypted message) appeared.

The Enigma wasn’t the only machine that was used to encrypt (and decrypt) messages.  Enigma (as far as I understand) was used for tactical communications; higher-level strategic communications from the German high command were transmitted using the Lorenz cypher.  This more complicated machine contained a paper tape reader which allowed the automatic transmission of messages, dispensing with the need for a Morse code operator.

In terms of the scale of the operation at Bletchley Park, we were told that three thousand Enigma messages and forty Lorenz messages were being decoded every day.  To help with this, there were 210 Bombe machines for the Enigma codes and, for the Lorenz messages, a machine that is sometimes described as ‘the world’s first electronic computer’: Colossus.  At its peak, there were apparently ten thousand workers (a quarter of whom were women), running three shifts.

Bombe Demo

After a short break, we were gently ushered downstairs to one of the museum exhibits; a reconstruction of a Bombe machine.  This was an electro-mechanical device that ‘sped up’ the process of discovering Enigma machine settings.  Two operators described how it worked and then turned it on.  It emitted a low whirring and clicking noise as it mechanically went through hundreds of combinations.

As the Bombe was running, I had a thought: I wondered how you might go about writing a computer program, or a simulation, to do pretty much the same thing.  The machine operators talked about the use of something called a ‘code map’, which helped them to find a route towards the settings.  I imagined an interactive smartphone or tablet app that allowed you to play with your own version of a Bombe, to get a feel for how it would work...  There could even be a virtual Enigma machine to play with; you could create a digital playground for budding cryptographers.
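As a first, very crude attempt, you could imagine something along these lines (a toy shift cipher rather than an Enigma, and nothing like the electro-mechanical cleverness of the real Bombe): guess a word that probably appears in the message – a ‘crib’, such as the heading of a weather report – and test every possible setting until the crib appears.

    # A toy 'Bombe': recover an unknown key by testing every candidate against a crib.
    # This only illustrates the guess-a-crib-and-test-every-setting idea.
    import string

    ALPHABET = string.ascii_uppercase

    def shift(text, key):
        return ''.join(ALPHABET[(ALPHABET.index(c) + key) % 26] for c in text)

    def candidate_keys(ciphertext, crib):
        return [k for k in range(26) if crib in shift(ciphertext, -k)]

    ciphertext = shift('WEATHERREPORTFOLLOWS', 11)
    print(candidate_keys(ciphertext, 'WEATHER'))    # -> [11]

The real Bombe didn’t try settings blindly like this; as I understand it, it used cribs together with the Enigma’s wiring (and the fact that a letter could never encrypt to itself) to rule out huge numbers of settings at once – but the crib idea is the same.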

Of course, there’s no such thing as an original thought: a Bombe simulator has already been written by the late Tony Sale (who reconstructed the Colossus machine), and a quick internet search revealed a bunch of Enigma machine simulators.  One burning question is how we might make the best use of these tools and resources.

Archive Talk

The next part of the day was all about the archive: the real reason I signed up for this event.  I have to confess that I didn’t really know what to expect, and this sense of uncertainty was compounded by having a general interest rather than a very specific research question in mind.

The archive is run by the Bletchley Park Trust.  GCHQ, the Government Communications Headquarters, is the custodian of the records that have come from Bletchley Park.  I understand that GCHQ is going to use Bletchley Park as its ‘reading room’, having lent around one hundred and twenty thousand documents for a period of fifty years.

By way of a very general introduction, a number of samples from the archive were dotted around our training room.  These ranged from Japanese language training aids (and a hand-written Japanese-English dictionary), forms used to help with the decryption of transmissions, through to samples of transmissions that were captured during the D-Day landings.

Apparently, there’s a big project to digitise the archive.  There is a multi-stage process that is under way.  The first stage is to have the artefacts professionally photographed.  This is followed by (I believe) storing the documents in some kind of on-line repository.  Volunteers may then be actively needed to help create metadata (or descriptions) of each repository item, to enable them to be found by researchers.

Tour

The final part of the day was a tour.  As I mentioned earlier, I’ve been on a couple of Bletchley Park tours, but this was unlike any of the tours I had been on before.  We were all given hard hats and told to don high-visibility jackets.  We were then ushered into the driving rain.

After a couple of minutes of trudging, we arrived at a building that I had first seen when I entered the site.  The building (which I understand was known as ‘hut 3’) was to become a new visitor’s centre.  From what I remember, the building used to be one of the largest punched card archives in Europe, known as Deb’s delight (for a reason that completely escapes me).    It was apparently used to cross-reference stuff (and I’m writing in terrible generalisations here, since I really don’t know very much!) 

Inside, there was no real lighting, and dust from work on the floors hung in the air.  There was a strong odour of glue or paint.  Stuff was clearly happening.  Internal walls had been stripped away to reveal a large open-plan area which would become an ideal exhibition space.  Rather than being a wooden prefabricated ‘hut’, we were walking through a substantial brick building.

Minutes later, we were directed towards two other huts that were undergoing restoration.  These were the wooden ones.  It was obvious that these buildings had lacked any kind of care and attention for many years, and workmen were busy securing the internal structure.  Avoiding lights and squeezing past tools, we snaked through a series of claustrophobic corridors, passing through what used to be the Army Intelligence block and then onto the Navy Intelligence block.  These were the rooms in which real secrets became clear.   Damp hung in the air, and mould could be seen creeping up some of the old walls.  There was clearly a lot of work that needed to be done.

Final thoughts

Every time I visit Bletchley Park, I learn something new.  This time, I became more aware of what happened in the different buildings, and I certainly learnt more about the future plans for the archive.  Through the talks that took place at the start of the day, I also learnt of a place called the Telegraph Museum (museum website), which can be found at Porthcurno, Cornwall.  When walking through the various corridors to the education room, I remember seeing a large poster that suggested that all communication links come to Bletchley Park, and that Bletchley is the centre of everything.

When it comes to a history of computing, it’s impossible to separate the history of the computer from the history of telecommunications.  In Bletchley Park, communications and computing are fundamentally intertwined.  There’s another aspect, which is that computing (and computing power) has led to the development of new forms of communication.  Before I go any further forward in time (from, say, 1940 onwards), there’s a journey that I have to make back in time: a diversion to discover more about telecommunications, and a good place to start is by learning more about the history of the telegraph system.

I’ll be back another day (ideally when it’s not raining) to pay another call to Bletchley Park, and will also drop in to The National Museum of Computing, which occupies the same site.

Permalink 1 comment (latest comment by Rebecca Kowalski, Thursday, 13 Feb 2014, 14:52)
Christopher Douce

Gresham College Lecture: Notations, Patterns and New Discoveries (Juggling!)

Visible to anyone in the world

On a dark winter’s evening on 23 January 2014, I discovered a new part of London I had never been to before.  Dr Colin Wright gave a talk entitled ‘notations, patterns and new discoveries’ at the Museum of London.   The subject was intriguing in a number of different ways.  Firstly, it was all about the mathematics of juggling (which represented a combination of ideas that I had never come across before).  Secondly, it was about notations.

The reason why I was ‘hooked’ by the notation part of the title is that my home discipline is computer science.  Computers are programmed using notation systems (programming languages), and when I was doing some research into software maintenance and object-oriented programming I discovered a series of fascinating papers about something called the ‘cognitive dimensions of notations’.  Roughly put, these were all about how we can efficiently work with (and think about) different types of notation system.

In its broadest sense, a notation is an abstraction or a representation.  It allows us to write stuff down.  Juggling (like dance) is an activity that is dynamic, almost ethereal; it exists in time and space, and then it can disappear or stop in an instant.  Notation allows us to write down or describe the transitory.  Computer programming languages allow us to describe sets of invisible instructions and sequences of calculations that exist nowhere except within digital circuits.  When we’re able to write things down, it turns out that we can more easily reason about what we’ve described, and make new discoveries too.

It took between eight and ten minutes to figure out how to get into the Museum of London.  It sits in the middle of a roundabout that I’ve passed a number of times before.  Eventually, I was ushered into a huge cavernous lecture theatre, which clearly suggested that this was going to be quite ‘an event’.  I was not to be disappointed.

Within minutes of the start of the lecture, we heard names of famous mathematicians: Gauss and Liebniz.  One view was that ‘truths (or proofs) should come from notions rather than notations’.  Colin, however, had a different view, that there is interplay between notions (or ideas) and notations.

During the lecture, I made a note of the following: a notation is a ‘specialist terminology [that] allows rapid and accurate communication’.  Colin then moved on to ask the question, ‘how can we describe a juggling pattern?’  This led to the creation of an abstraction that could then describe the movement of juggling balls.

Whilst I was listening, I thought, ‘this is exactly what computer programmers do; we create one form of notation (a computer program) using another form of notation (a computer language) – the computer program is our abstraction of a problem that we’re trying to solve’.  Colin introduced us to juggling terms (or high-level abstractions), such as the ‘shower’, ‘cascade’ and ‘Mills mess’.  This led towards the more intellectually demanding domain of ‘theoretical juggling’ (with an impossible number of balls).
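My notes don’t record the name of the notation that Colin used, but the standard mathematical notation for juggling patterns is ‘siteswap’, where each number is the height of a throw; I assume this is close to what we were shown.  A nice property is that a sequence is only a valid pattern if no two throws land on the same beat, and that, for valid patterns, the average of the numbers is the number of balls.  A few lines of Python are enough to check both:

    # A small siteswap checker: a throw of height t made on beat i lands on beat i + t.
    def is_valid_siteswap(pattern):
        n = len(pattern)
        landing_beats = {(i + throw) % n for i, throw in enumerate(pattern)}
        return len(landing_beats) == n          # valid only if no two throws collide

    for pattern in [(3,), (5, 3, 1), (4, 4, 1), (5, 4, 3)]:
        average = sum(pattern) / len(pattern)   # for valid patterns, the number of balls
        print(pattern, is_valid_siteswap(pattern), average)

Running this reports that 3, 531 and 441 are valid three-ball patterns, whilst 543 is not – a tiny example of how a notation lets you reason about (and discover) patterns without ever touching a ball.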

My words can’t really do the lecture justice.  I should add that it is one of those lectures where you would learn more by listening to it more than once.  Thankfully, for those who are interested, it was recorded and is available on-line (Gresham College).

Whilst I was witnessing all these great tricks, one thought crossed my mind, which was, ‘how much time did you have to spend to figure out all this stuff and to learn all these juggling tricks?!  Surely there was something better you could have done with your time!’ (Admittedly, I write this partially in jest and with jealousy, since I can’t catch, and I fear that doing ‘a cascade’ with three balls is, for me, a theoretical impossibility).

It was a question that was implicitly answered by considering the importance of pure mathematics.  Doing and exploring stuff only because it is intellectually interesting may potentially lead to a real world practical use – the thing is that you don’t know what it might be and what new discoveries might emerge.  (A good example of this is number theory leading to the practical application of cryptography, which is used whenever we buy stuff over the internet). 

All in all, great fun.  Recommended.

Christopher Douce

Gresham College Lecture: User error – why it’s not your fault

Visible to anyone in the world

On 20 January 2014 I found the time to attend a public lecture in London that was all about usability and user error.  The lecture was presented by Tony Mann, from the University of Greenwich.  The event was in a group of buildings just down the street from Chancery Lane underground station.  Since I was keen on this topic, I arrived twenty minutes early, only to find that the Gresham College lecture theatre was already full to capacity.  User error (and interaction design), it seems, is a very popular subject!

One phrase that I’ve made a note of is that ‘we blame ourselves if we cannot work something’, that we can quickly acquire feelings of embarrassment and incompetence if we do things wrong or make mistakes.  Tony gave us the example that we can become very confused by the simplest of devices, such as doors. 

Doors that are well designed should tell us how they should be used: we rely on visual cues to tell us whether they should be pushed or pulled (which is called affordance), and if we see a handle, then we regularly assume that the door should be pulled (which is our application of the design rule of ‘consistency’).  During this part of Tony’s talk, I could see him drawing heavily on Donald Norman’s book ‘The Psychology of Everyday Things’ (Norman’s work is also featured within the Open University module M364 Fundamentals of Interaction Design).

I’ve made a note of Tony saying that when we interact with systems we take information from many different sources, not just the most obvious.  An interesting example was the Kegworth air disaster (Wikipedia), which occurred because the pilots had turned off the wrong engine, drawing on experience gained from a different but similar aircraft.

Another really interesting example was the case where a pharmacy system was designed in such a way that drug names could be no more than 24 characters in length.  This created a situation where different drugs (which had very similar names, but different effects) could be prescribed by a doctor in combinations which could potentially cause fatal harm to patients.  Both of these examples connect perfectly to the safety argument for good interaction design.  Another argument (that is used in M364) is an economic one, i.e. poor interaction design costs users and businesses both time and money.
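To make the truncation problem concrete, here is a tiny illustration (the drug names below are hypothetical label strings that I have made up; they are not the ones from the case Tony described): a fixed 24-character field can make two very different prescriptions look identical.

    # Hypothetical labels, for illustration only: a fixed 24-character field
    # silently cuts off the part of the label that actually distinguishes the drugs.
    FIELD_WIDTH = 24

    labels = [
        'methylprednisolone sodium succinate 1g',
        'methylprednisolone sodium succinate 125mg',
    ]
    for label in labels:
        print(label[:FIELD_WIDTH])   # both lines print 'methylprednisolone sodiu'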

Tony touched upon further issues that are also covered in M364.  He said that ‘we interact best [with a system] when we have a helpful mental model of a system’, that our mental models determine our behaviour, and that humans (generally) have good intuition when interacting with physical objects (and it is hard to discard the mental models that we form).

Tony argued that it is the job of an interaction designer to help us to create a useful mental model of how a system works; if there’s a conflict (between what a design tells us and how we think something may work), we can very easily get into trouble very quickly.  One way to help with this is to make use of metaphor.  Tony Mann: ‘a strategy is to show something that we understand’, such as a desktop metaphor or a file metaphor on a computer.  I’ve also paraphrased the following interesting idea: a ‘designer needs to both think like a computer and think like a user’.

One point was clearly emphasised: we can easily choose not to report mistakes.  This means that designers might not always receive important feedback from their users.  Users may too easily think, ‘that’s just a stupid error that I’ve made…’  Good design, it was argued, prevents errors (which is another important point that is addressed in M364).  Tony also introduced the notion of resilience strategies: things that we do to help us avoid making mistakes, such as hanging our scarf in a visible place so we remember to take it home after we’ve been somewhere.

The three concluding points were: we’re always too ready to blame ourselves when we make a blunder, that we don’t help designers as often as we ought to, and that good interaction design is difficult (because we need to consider different perspectives).

Tony’s talk touched upon wider (and related) subjects, such as the characteristics of human error and the ways that systems could be designed to minimise the risk of mistakes arising.  If I were to be very mean and offer a criticism, it would be that there was perhaps more of an opportunity to talk about the ‘human’ side of error – but here we begin to step into the domain of cognitive psychology (as well as engineering and mathematics).  This said, his talk was a useful and concise introduction to the importance of good interaction design.

Christopher Douce

Interaction design and user experience for motorcyclists

Visible to anyone in the world
Edited by Christopher Douce, Sunday, 9 Feb 2014, 16:04

Has anyone ever uttered the following phrases: ‘it must be me!’ or ‘I must be stupid, I can’t work this system!’?  When you say those words, the odds are that the problem has little to do with you and everything to do with the system that you’re trying to use.

Making usable systems and devices is all about understanding different perspectives and thinking about compromises.  Firstly, there’s the user (and understanding what he or she wants to do using a system).  Secondly, there’s the task that has to be completed (and how a task might be connected to other tasks and systems).  Finally, there’s the question of the environment, i.e. the situations in which a product is going to be used.  If you fully understand all these aspects in a lot of depth and balance one aspect against another, then you’ll be able to design a system that is usable (of course, this is a huge simplification of the process of interaction design, but I’m sure that you get my point).

Parking a motorbike

A couple of months ago I took a course at my second favourite academic institution, CityLit.  Since it was pretty good weather (despite being January), I decided to ride my scooter into the middle of London and park in one of the parking bays that were not too far from the college.  The only problem was that the City of Westminster has introduced a charging scheme, and this was a system that I hadn’t used before.

This blog post is a polite rant about (and reflection on) the banal challenge of trying to pay Westminster council a grand total of one pound and twenty pence.  It turns out that the whole exercise is an interesting example of interaction design, since it helps us to think about issues surrounding who the user is, the environment in which a system is used and the task that has to be completed.  Paying for parking sounds like a pretty simple task, doesn’t it?  Well, let me explain…

Expecting trouble

Having heard about the motorcycle parking rules in Westminster, I decided to do some research.  I expected a simple system where you texted your bike registration number and location code to a designated ‘parking’ telephone number, and, through the magic of mobile telephony, one English pound was added to your monthly mobile phone bill and the same English pound was appropriated to Westminster Council.  Well, it turned out to be a bit more complicated than that.  Payments don’t come from your phone account but instead come from your credit card.  This means that you need to connect your phone number to your credit card number.

When you’ve found the motorbike registration site (which isn’t through a recognisable ‘Westminster Council’ URL), you get to create something called a ‘parking account’.  When logged in, you’re asked to enter the registration number of your vehicle.  In my case, since I’m pretty weird, I have two motorbikes: one that makes the inside of the garage look pretty, and another one (a scooter) that I sometimes use to zip around town on.   There are enough spaces to enter the registration codes for four different bikes. 

The thing is, I can’t remember the registration numbers for any of my bikes!  It turns out that I can hardly remember anything!  I can’t remember my phone number, I can’t remember my credit card number and I can’t remember two registration numbers.  I must be an idiot!  (Thankfully, I remembered my email address, which is something else you need – just make sure you know the password to access your account).

There was another oddity of the whole system.  After you’ve got an account, you log in using a PIN code, which is the last four numbers of your credit card.  I never use these four numbers!  Again, I don’t know what they are (unless I look them up).  I was starting to get a bit impatient.

Arriving at the parking bay

The ride to the middle of town was great.  It was too early in the day for most people, which meant that the streets were quiet.  After parking my bike, I started to figure out how to pay.  I looked at an information sign, which was covered in city grime, and immediately saw that it didn’t have all the information I needed.

I visited the parking website and discovered that you needed FOUR different numbers!  You needed a phone number, a location number (where your bike is parked), a day code (to indicate how long you’re parking your bike for), and the final four numbers of your registered credit card.  Thankfully, I had the foresight to save the parking telephone number in my phone, so I only had to send three numbers (but I would rather have avoided messing around with my wallet to fish out my credit card; it meant unzipping and then zipping up layers of protective clothing).

Coffee break

At last, I had done it.  I had sent a payment text.  To celebrate my success, I visited a nearby café for a coffee and a sit down.  About ten minutes later, I received a text message that confirmed that I had paid for parking ‘FOR THE WRONG BIKE!’ 

The text message confirmed that I had just paid for parking for my ridiculous bike rather than the sensible city scooter that I had just used.  Also, when I registered both bikes on the system, I entered the scooter registration first, since it would be the bike that I would be using most.  At this point, I had no idea whether the system had simply assumed that I had ridden either (or both) of my bikes to Westminster at the same time; there was no clear way to choose one bike as opposed to the other.  Again, I felt like an idiot.

Then I had a crazy thought – perhaps I ought to look at my ‘parking record’, since there might be a way to change the vehicle I was using.  I logged in to the magic system (through my smartphone), entering the last four digits of my credit card again, and found a screen that seemed to do what I wanted.  It encouraged me to enter start and end dates (what?), and then had a button entitled ‘generate report’.  A report on what?  The number of toys found in Kinder Eggs that are considered to be dangerous?  I pushed the button.  Nothing happened.  I had no parking history, despite having just sent a parking text.  Effective feedback is one of the most obvious and fundamental principles of good usability.

Chat

It took me around five minutes to walk to the college.  When I got there I discovered two other motorcycle parking bays that were just around the corner.  I then made a discovery: different bays seemed to have the same location ID.  It then struck me: perhaps the second number I had been entering into the phone was totally redundant!  Perhaps it’s the same code that is used all over London!

During my class I got chatting to a fellow biker.  After I had emoted about the minor trauma of trying to pay for the parking, my new biker friend said, ‘there’s an app for this…’  Again, I thought, ‘why didn’t anyone tell me!’  So, during a break I found the right app and started a download.  After a couple of minutes of nothing happening, I was presented with the delightful message: ‘Error downloading: 504’.

Final thoughts

A really good interaction design principle is that you should always try to design systems which minimise what users need to remember (the usability heuristic usually given the title ‘recognition rather than recall’).  On this system, you needed to remember loads of different numbers and codes.  The task is pretty simple.  There is a fixed fee.  The only variables that you might want to enter are the length of the stay (in days) and the choice of vehicle.  But what happens if your phone runs out of charge and you want to use a friend’s phone to pay?  You’ll then have to make a telephone call to an operator, all for the sake of one pound twenty.

There’s also the environment to contend with.  I had to take my gloves off, fumble around in my pockets for my mobile phone and then enter the numbers.  The information sign was pretty small (and I can’t remember it mentioning anything about using an app).  I dread to think how difficult the process is if English isn’t your first language and you don’t know that Westminster has bike parking fees.

One final thought is that one approach to learning more about the user experience is to observe users as they go about the things that they do.  This is an approach that has drawn heavily from the social sciences, and on Open University modules such as M364 Interaction Design, techniques such as ethnography are introduced.  Another approach to learning about user successes and failures is to search on-line, to learn about the problems other people have experienced.  Although this isn’t explicitly covered in M364, it is an interesting technique.

All this said, the second time that I needed to pay, I used the ‘pay by phone’ parking app.  The ‘504’ error message that I wrote about earlier had miraculously disappeared (why not a message that says, ‘please try again later’?) and I was able to download the app, press a couple of on-screen (virtual) buttons and enter the last four numbers of my credit card (again, a number that I still haven’t memorised, since no other system asks me for it…).  I even managed to pay for the right bike this time!

