Christopher Douce

Teaching and learning programming for mobile and tablet devices: London Metropolitan University

On 24 July 2014, I went to a Higher Education Academy sponsored event at London Metropolitan University.  The event was all about programming mobile devices, and it was the third time I had been to this event.  The previous time I went along, I spoke about a new module: TT284 Web Technologies (OU website).  This time I had two purposes: to share something about the beginnings of a new module TM352 Web, Mobile and Cloud (or, more specifically, its main objectives) and to learn what other institutions are getting up to.

A case study…

The first presentation of the day was by Yanguo Jing from London Met (who had organised the event) and Alastair Craig.  They presented ‘a case study of the delivery of a year 12 summer school on mobile app development’ (I had to ask what ‘year 12’ meant: it means 16 or 17 year olds…): this was part of an outreach programme that London Met runs, where students were selected at random to participate.

They described some of the challenges that they faced.  Firstly, some of the students who joined the summer school had no programming knowledge, and the organisers had to make the summer school fun.  A really big challenge was trying to scaffold the learning so that the students could create something presentable by the end of the week.

At this HEA event last year, a new programming system called TouchDevelop was introduced.  TouchDevelop is a ‘touch friendly’ programming language from Microsoft Research.  (You can check out the kind of apps that have been created by visiting the apps section of the TouchDevelop site).

The language features a touch screen programming interface that is especially designed to work with mobile devices; it constrains users to choosing from only those programming constructs that are valid at any given point (it is also graphical in the same sense that Scratch is).  One really interesting aspect of the system is that you don’t have to install anything.  TouchDevelop also generates HTML 5 code, which means that apps can be run on a wide range of different devices.

The summer school lasts for a week.  It begins with an introduction to the tool and a discussion of syntax.  The next two days are all about the basics of a game and the game engine.  On the fourth day the students are asked to create their own game, and on the fifth day they are asked to present their games to each other.  Masters level students acted as supervisors.  One point that was made was that some students (those who had some prior programming experience, invariably using Scratch) raced ahead with everything.

A fundamental question is, ‘how do you teach people in 18 hours when you don’t know what they know?’  The trick, apparently, is to get them to do things.

Some discussion questions were: ‘is it a good idea [to run this kind of summer school]?’, ‘does your department do something similar?’, and ‘how might you scale up this type of outreach activity?’

One thing that I learnt from the discussion is that there is a new version of Scratch available.  This first presentation ended with a discussion about MOOCs, and the point was made that MOOCs are very different to outreach.

Considering the cloud: teaching mobile, cloud computing and the web

The second presentation of the day was by yours truly.  The aim of the presentation was to talk about some of the areas that a new module about cloud computing may (or may not) cover.  Towards the end of the presentation, I asked all the delegates the following questions:

  • What do you think needs to be taught (cloud, mobile, web?)
  • How might you teach these concepts?
  • What might the challenges be?
  • How might you carry out assessments?
  • How do we protect and inform about change?

As everyone discussed these questions, I made a few notes.  One of the fundamental challenges (with an OU course) is to choose technologies that are not going to age quickly.  ‘The cloud’ is a really fast moving area where there appears to be continual change and innovation; new software services and releases are coming out all of the time.  One way to counter this is to teach the underlying concepts and not just information about the services.

Another approach is to perhaps concentrate on building a learning community.  Developers and technical specialists invariably live within a community that shares technical knowledge and expertise.  It might be interesting and useful to expose learners to the dynamics of these environments.

An interesting point was that both mobile and web platforms are just different ways to consume resources.  Increasingly the ‘web’ is being equated with HTML 5, and HTML 5 is increasingly being embedded within mobile devices.

On the subject of teaching, one delegate made a really interesting and relevant point.  He said, ‘I’ve given up lecturing… half of them just turn off’.  When it comes to teaching the development of mobile apps, the thing to do is to split students into small groups; it is the learning by doing that really counts.

When it comes to assessment, one delegate said, ‘you’ve got to have a project – if you can’t develop an app, then you fail’, and it’s important to get continual updates on progress.  Other approaches might include the use of computer marked multiple-choice questions, and writing about the bigger reflections and lessons from the module.

Poster session

By way of a brief interlude, Yanguo introduced a series of posters that had been put on the wall of the meeting room.  The posters were all about different apps that students had created.  There were two indoor navigation apps, an app for parking (which made me remember one of my blog-rants about poor interaction design), some kind of ‘cash register’ virtual payment app, a food checker or testing app, and a museum guide app.

Bringing the cloud into the classroom

The third presentation of the day was by Paul Boocock, from Staffordshire University.  Paul mentioned that undergrad students are introduced to a range of different platforms: iOS, Android and Windows (if I’ve understood things correctly).  For postgraduate students, there are a number of interesting sounding modules, such as Android app development and Advanced location aware app development.  These link into different mobile technology postgraduate qualifications (Staffs University), such as their Mobile Device Application Development MSc, Postgraduate Certificate (PgCert) and their Postgraduate Diploma (PgDip).

One of the big recent changes to their curriculum is that Staffs is now incorporating ‘the cloud’ into the different mobile modules.  One thing that I should mention is that the concept of ‘the cloud’ is understood in terms of public clouds (as opposed to private clouds that are hosted by the university).

Paul treated us to some pictures of data centres, and said ‘[the cloud] is changing how we teach this stuff’.  He left us with an interesting idea: ‘what used to take 30 days to get up and running can now be achieved in 30 minutes’.  The point was simple: you no longer need to buy, configure and commission servers.  The benefits of ‘the cloud’ include potentially lower costs, scaling and the potential of gaining global reach.  In some respects, though, it may become more difficult for students to gain direct exposure to the physical hardware that runs these systems.

We were introduced to a term that was unfamiliar to me: cloud computing patterns.  The term relates to the way that cloud systems are consumed as opposed to how they are designed.  Some patterns include on/off workloads, i.e. an application might experience high levels of demand for a while (a bit like batch jobs); a product or system that takes off very quickly (so demand keeps growing); and predictable or unpredictable bursts of traffic (as happens within computer games, for example).
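
These consumption patterns are what autoscaling rules ultimately respond to.  A minimal sketch of the idea, in plain Java with invented thresholds and interfaces rather than any particular vendor's API, might look like this:

    // A minimal, hypothetical autoscaling rule: adjust the number of server
    // instances in response to bursty demand.  The thresholds and the
    // Monitor/Provisioner interfaces are illustrative, not a real cloud API.
    public class SimpleAutoscaler {

        interface Monitor { double requestsPerSecond(); }
        interface Provisioner { void setInstanceCount(int count); }

        private static final double SCALE_UP_THRESHOLD = 500.0;    // requests/sec per instance
        private static final double SCALE_DOWN_THRESHOLD = 100.0;  // requests/sec per instance
        private static final int MIN_INSTANCES = 1;
        private static final int MAX_INSTANCES = 20;

        private int instances = MIN_INSTANCES;

        public void tick(Monitor monitor, Provisioner provisioner) {
            double loadPerInstance = monitor.requestsPerSecond() / instances;
            if (loadPerInstance > SCALE_UP_THRESHOLD && instances < MAX_INSTANCES) {
                instances++;   // burst of traffic: add capacity
            } else if (loadPerInstance < SCALE_DOWN_THRESHOLD && instances > MIN_INSTANCES) {
                instances--;   // quiet period: release capacity (and cost)
            }
            provisioner.setInstanceCount(instances);
        }
    }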

Paul also talked about different platforms.  He mentioned a good number that I had heard of (but am not intimately familiar with).  These were Amazon (of course), Microsoft, Rackspace, HP Public Cloud, and Google Cloud.  Given that his focus was on public clouds for teaching purposes, he discounted HP and Rackspace (I think due to cost), and then considered Amazon.

Amazon apparently offers something called educational grants (Amazon website), which allow educators to gain free credits so that computing students can use their services.  The trade-off is that students who use the Amazon systems will be able to take their skills directly into the workplace.  Apparently, you can tell Amazon how many students you have, and they sort out the number of licences (or credits).

We learnt that Microsoft (of course) runs a similar scheme, which enables students to use Azure academic passes (Microsoft Azure website).  Google was not considered as an alternative since there are currently no discounts for non-profit organisations.  In the case of Staffordshire, Paul opted for Microsoft, mainly because the university had already made an investment in Microsoft tools and environments.

Before a live coding demo, which featured a pre-built service (from what I’ve noted), we were given a brief description of the different Azure components (or Azure services).  These were: compute, app services, data services, and network (this reminded me that I’ve come across similar terms when looking at the open source equivalent, OpenStack).

At the end of Paul’s session there was a lot of time for discussion.

Points of discussion included the challenge of working with different SDKs, and the emphasis on design patterns.  On the masters course, students were asked to create an interactive chat app that was not too dissimilar to the hugely popular WhatsApp.

Of course, there are always challenges that educators need to be mindful of.  These include the need to change modules without increasing their difficulty, and the question of how to assess everything if everything exists in the cloud (and students create services using lots of template code).  One way to do this is, of course, to ask students to write a reflective report about what they did, to get a sense of what they understand.

All in all, it was both really interesting and really useful to learn how another institution had successfully tackled the introduction of cloud programming into its computing curriculum.

Developing digital literacies

The fourth talk of the day was by Terry McAndrew, and had the subtitle, ‘how students can quickly create interactive media resources for your curriculum’.  Terry spoke about the broad subject of ‘digital literacy’, which can be defined as ‘the ability to effectively engage with a range of digital technologies to create, navigate and manipulate information’.  Terry mentioned a resource known as the JISC Digital Literacy InfoKit (JISC website).  The kit contains seven different areas: information literacy, media literacy, communication and collaboration, career and identity management (which I understand to be a new addition), ICT literacy, learning skills and digital scholarship.  A two year digital literacy programme (JISC) was also mentioned.

Interestingly, Yanguo mentioned some digital literacy resources that were available from London Met.  There’s also another bunch of digital literacy resources from the University of Southampton.  All these different resources made me realise that perhaps this is an area that I really need to catch up on.

Another part of Terry’s presentation centred upon accessibility.  Terry mentioned a tool called Xerte (University of Nottingham), which can be used to create accessible digital material that can be delivered through a virtual learning environment to different devices.  It’s a tool that is sometimes used by students who are studying a module that I tutor, H810 Accessible online learning: supporting disabled students (OU website).  The content that is delivered is presented using HTML 5, but the editor uses Adobe Flash (we were, however, told that there are plans afoot to develop an HTML 5 based editing environment).

Two other interesting links (and projects) were mentioned.  The first was JORUM, a repository of digital educational material that can be shared between different institutions.  JORUM has been going for quite a while, and I hadn’t heard it mentioned for quite some time.  Having a quick look at the JORUM site quickly tells me that it has changed quite a bit since I first looked at it properly (which must have been around six or seven years ago).  The second was a project called ACTOER, an abbreviation for Accessibility Challenges and Techniques for Open Educational Resources (of which Terry, who is based at TechDis, is the project manager).

I enjoyed Terry’s talk, and I found his presentation of different digital literacy resources useful, but there was little about the learning and teaching of how to program mobile devices.  This said, accessibility is always really important, and it’s something that designers of curriculum need to always be mindful of: I welcomed Terry’s reminders.

Alignment of mobile learning agenda with learning and teaching strategies in HEIs

The final presentation of the day was by Remy Olasoji from the University of East London.  From what I remember, I understand Remy to be an expert in the field of requirements engineering.  His presentation was about taking lessons from requirements engineering to try to understand how best to make use of mobile technology.

A final question of the day was, ‘how do we drive the mobile agenda forward?’  A simple answer was: ‘mobile is already happening – it’s driving forward of its own accord’.  One challenge lies with figuring out how to teach the fundamentals of mobile technologies to enable students to be thoroughly equipped and prepared when they have to work with new and changing devices.  Another challenge lies with figuring out how to best make use of devices to help students with their studies.

Reflections

All in all, a useful event; it’s always useful to hear what happens within other institutions and to learn about what challenges educators need to overcome.  One area that I would like to have heard more discussion about is information and data security.  The ‘cloud’ exposes these issues quite naturally, along with issues that relate to business and management.

HEA Workshop: Teaching and learning programming for mobile and tablet devices

On 25 June 2013, I popped over to the London Metropolitan University to attend a HEA sponsored workshop that was all about how to best teach the programming of mobile devices.  My role there was to present something about an OU module that I help out on: TT284 Web Technologies, but I'll be saying a bit more about that in a little while.

Yanguo Jing, from London Met, kicked off the day by talking about the twenty credit MSc module that he leads.  Yanguo said that his module is strongly connected with industry and various technology vendors, and that important themes are innovation and enterprise.  Importantly, students have an opportunity to carry out research themselves, create their own projects, develop their own apps and present their own findings.  One way that they do this is by making their own videos (which is also a great way to create evidence that can contribute to assessments).

Yanguo also mentioned something called the Wow Agency.  One of the important points of having a more direct connection with industry is that students get more immediate exposure to its demands.  This was thought provoking stuff.

Teach the future, not the past: Blackberry 10 development

Luca Sale and Simon Howard gave the first of two vendor presentations.  I'll put my hand up and say that I know next to nothing about developing applications for Blackberry devices.  In fact, I don't think I've ever used a Blackberry device other than to scroll through a message, when a friend briefly gave me their device to look at!

This presentation was all about developing for a new device, the Blackberry Z10.  I have heard bits and pieces about this, but the new device runs a totally new operating system, Blackberry 10.  Interestingly, it is based on an operating system called QNX (Wikipedia) (which I had vaguely heard of before).  Basically, it uses a microkernel architecture (which means it has a way to enforce stronger separation between the hardware and the main operating system that runs a device), it's pretty small, and it is used in a range of different embedded systems.

Apparently, there are a number of software development kits (SDKs), which means that it's possible to take an existing Android app and port it to the Blackberry (and have it deployed to users via the Blackberry equivalent of an app store).  The SDKs that were mentioned included Qt, HTML 5, Blackberry native, Adobe Air, and the Java Android Runtime.

There was a quick live coding demo of how to create apps using the HTML 5 framework.  Other languages that might be used to craft code included Javascript (in conjunction with HTML 5), C++, and Java (as far as I understand).  At the end of the presentations, Nafeesa Dajda described the Blackberry Academic Programme (Blackberry).

Microsoft devices and services

Lee Stott continued the vendor specific part of the day by making a Microsoft themed presentation.  Microsoft, of course, has been investing significantly in the mobile devices space.  Not only do they have Windows phones, but they (of course) also have their Touch PCs.  So much so that their new operating system (Windows 8) aims to create an experience specifically for tablet devices.

Lee talked about software ecosystems and mentioned that services (as well as devices) are important too.  Services can also be thought of in terms of cloud services, and we were told that the cloud was becoming more and more important.  Since data is stored elsewhere, users have the potential to move between different devices and still have access to their documents and data, thus enhancing the user experience.

One of the most interesting parts of Lee's talk was where he spoke about the Microsoft Azure services.  I have to confess that it's been quite a while since I've been a Microsoft developer (in the intervening years I've done some PHP and coding using open-source frameworks), so it was useful to learn what the company has been up to and what services they are offering.

One of the challenges that I've always puzzled over is how, if you run your own tech company, you might go about running and maintaining your own servers and databases.  System administration is a necessary evil: getting to grips with real kit and devices is important, but it is a detailed technical specialism in its own right.

If I've understood this correctly, Microsoft can host a virtual server which can then host your own database.  I'm also assuming that, if you want, you can also write your own web services to do whatever magic stuff you need to do, which can then be consumed by users of mobile devices (or any other kind of client).  Customers of this service are then billed per minute of processor time.  I can see the benefits; server plant depreciates quickly and keeping it maintained is always going to cost money.  I find this approach to hosting and consuming data really interesting, especially since it offers a way to devolve risk to a third party.  Of course, there are a number of competitors (Wikipedia) in the cloud services arena.  This whole area seems to be a new subject in its own right.
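
To make this concrete, here's a minimal sketch of the kind of small web service that might run on such a hosted server and be consumed by a mobile app (it uses only the HTTP server built into the JDK; the /status endpoint and its JSON response are invented for illustration):

    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;

    // A tiny web service of the kind that might run on a cloud-hosted virtual
    // server.  The /status endpoint and its JSON payload are made up for
    // illustration; a real service would query the hosted database instead.
    public class TinyService {
        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/status", exchange -> {
                byte[] body = "{\"status\": \"ok\"}".getBytes(StandardCharsets.UTF_8);
                exchange.getResponseHeaders().add("Content-Type", "application/json");
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream out = exchange.getResponseBody()) {
                    out.write(body);
                }
            });
            server.start();  // a mobile client can now GET http://<host>:8080/status
        }
    }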

Just in case you're interested, here's a couple of links I've gathered up: the main Microsoft Faculty pages, the UK faculty connection blog, and a link to the Azure education blog.   Another link is DreamSpark which seems to be about giving students and institutions access to some of the latest tools and technologies.

TouchDevelop for Windows Mobile 8

The next talk was by David Renton, who is a lecturer in Computer Games Development.  David introduced a platform called TouchDevelop (Microsoft website), which used to be a Microsoft Research project.  TouchDevelop is a programming language that has a graphical feel.  Programs that are created using it have the appearance of a textual language, but elements of code are created using a series of menus (as far as I can understand).

The software that you can create using TouchDevelop can be run on different mobile devices.  In some respects, TouchDevelop occupies the same space as Scratch (MIT website).  David made the point that it's difficult to create good games in Scratch.  I can (personally) neither confirm nor deny David's assertion, but my own view is that Scratch is a fun and useful environment which allows users to escape from the tyranny of syntax, whilst at the same time gradually introducing them to different (and essential) programming constructs.

What was really interesting was that TouchDevelop contains cool stuff, such as a physics engine.  By providing such a facility, I can certainly see how and why such an environment could be particularly interesting and engaging.  Again, for those who are interested, David has a blog called Games4Learning.  A final interesting point is that TouchDevelop runs in a web browser, so will work on different platforms.

Shorter presentations: Lua and Corona, Digital Summer Camp

Ian Masters gave a short presentation entitled, 'teaching cross-platform mobile development using Lua and Corona'.  Corona (website) is a software development kit (SDK) and Lua (Wikipedia) is a programming language.  As with TouchDevelop, Ian demonstrated the use of an integrated physics engine.  During the follow on discussion, there was quite a bit of talk about the Unity Engine (Wikipedia), which I've heard mentioned at a number of other HEA gaming and mobile events.

Martin Underwood talked about Digital Summer Camp, an event where universities, colleges, industry vendors and other organisations have come together to help inspire young people who are interested in technology.  The Open University is also one of the 'digital skill leaders'.

iPhone game development at Robert Gordon University

Gordon Eccleston has been teaching the development of apps for quite some time.  He gave a short talk on what works and what hasn't worked.  Gordon introduced a new term: the flipped classroom!  I hadn't heard this term before, but apparently this is where students do some preparatory work at home to prepare for tutorials (I think I've got that right!)

Gordon spoke about how things have changed.  These days students invariably have their own devices.  One difficulty is that vendors are always changing their devices, which means that lecturers face the challenge of being unable to fully control their own teaching environment.  This said, Gordon does have access to some iPod Touch devices, allowing code created using the Xcode platform (the environment used to create iOS applications) to be deployed to real devices.

Gordon also mentioned that the school had access to the Unity3D engine.  This gave rise to an interesting discussion about the difference between games programming and games design courses.  I've also made a note that when it comes to submission of course work, submission to an app store represents one judgement on quality.  When it comes to further assessment by the lecturer, one approach is to ask students to create a screen cast.  Assessment, I seem to recall, is a perpetual challenge (especially with the continual changes in technology), as is how to provide both teaching and resources through a web based environment.

Mobile apps development: enhancing student employability

Sally Smith and Scott McGowan, both from Edinburgh Napier University gave a short talk and presentation on the importance of employability skills.  Sally, who is the head of school, said that employers value relevant experience, want to see applicants who have a relevant degree, and have good soft skills. 

Faced with the necessity to demonstrate employability skills, it was argued that it would be useful if students could create something (say, an app, or some other related project) that can be both added to a CV and talked about in an interview.  Sally also talked about the importance of industrial experience and how her institution and school tackled this issue.

Teaching and assessment strategies in mobile development

David Glass teaches mobile development to second year undergraduates at the University of Ulster.  Students can create apps for the Android platform with Java using Eclipse.  Important parts of the module that I've noted down are subjects such as user interface design, data persistence and networking.  There is also a period of self-study where students are to gain an overview of mobile devices.

Challenges include the teaching of programming and understanding what to assess and how.  The assessment approach that David mentioned sounds really interesting.  Students are required to address legal, ethical and social issues.  They are then required to develop a basic app before moving on to creating something that is more advanced.  A basic app might be something such as a simple calculator or a measurement converter.
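
To give a sense of scale, the core of such a 'basic app' is little more than a small piece of logic behind a user interface.  A minimal sketch (plain Java, leaving out the Android UI code a student would wrap around it) might be:

    // The core logic of a hypothetical measurement-converter app.  In an
    // Android version this class would be called from an Activity; here a
    // main method stands in for the user interface.
    public class DistanceConverter {

        public static double milesToKilometres(double miles) {
            return miles * 1.609344;
        }

        public static double kilometresToMiles(double km) {
            return km / 1.609344;
        }

        public static void main(String[] args) {
            System.out.printf("5 miles is %.2f km%n", milesToKilometres(5));
            System.out.printf("10 km is %.2f miles%n", kilometresToMiles(10));
        }
    }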

Interestingly, a more advanced app might be something like a 'my run tracker' app.  David made the important point that the task of creating apps lends itself to more open-ended assessment and group work.  Taking this approach has the potential to encourage creativity and help with motivation.

Design designers, don't program programmers

Lindsay Marshall, from the University of Newcastle, gave an impromptu talk that described his own ten credit postgraduate module and connected with many of the earlier debates.  At the end of his module, students are required to submit a portfolio.  Relating to the challenges of assessments, students were allowed to choose whatever platform they wanted, and choose whatever problem they wished to solve.  Students were encouraged to produce a design log and to present some kind of demonstration.  Moving forward this may take the form of a video presentation or recording.

Lindsay made the important point that it is also important to take the time to look at the code, as well as the final product.  Another component is the writing of a reflective essay, to describe what was learnt during the project.  Interestingly, there are no lab sessions.  Instead, Lindsay mentioned the importance of crit sessions, which is an important technique used in design.

What really struck me from Lindsay's presentation was something that was also pretty obvious: that there are significant connections between the design disciplines and software development.  Both are fundamentally creative subjects, and both require people to understand the inherent nature and characteristics of problems.

Web technologies

And finally, it was my turn.  During my slot I spoke about a new Open University module called Web Technologies (TT284, Open University website), emphasising the point that there are so many important technologies that underpin the use of mobile technologies and devices. 

TT284 is interesting in a number of different ways.  Firstly, it uses a set of case studies of increasing size.  Students move from understanding how to create an app for a small club or society, through to understanding what might happen as part of a software development company.  Students are then introduced to 'software in the large' (or sites that have incredibly high volumes), and what practical issues might need to be addressed.

When it comes to mobile technologies, drawing on a case study, students are asked to create an app for an Android device using MIT App Inventor (MIT website).  App Inventor is a graphical programming language, and the code it produces can be moved to real devices.  One of the challenges for any module that aims to teach mobile technologies is the way that technology changes so quickly.  A really good aspect of this particular module is that it also addresses a good number of fundamental and really important standards and technologies.

Reflections

I learnt quite a lot from the vendor presentations and it's always useful to hear about the industrial perspective, particularly in a field that is moving so phenomenally quickly.  Whilst it's great for academics to learn what industry is getting up to (and you might argue that this is a thoroughly essential part of the job description), the presence of vendors links to an implicit battle for the hearts and minds of developers.  Users choose devices and technology that allow them to do cool stuff.  Cool stuff is created by developers.  Developers, in many cases, come from universities.  Taking this even further, developers are employed by industries who ultimately want people to be skilled in using particular software infrastructures and ecologies.

Things have changed since I first started to go to these mobile technology events.  There are now many more devices than there were before.  The devices themselves have changed - they have more memory and power, and on the horizon there is a new generation of faster mobile networks.  By the same token, there are, of course, new tools, development environments, frameworks and libraries.  Educators are faced with the challenge of what to teach.  Some educators choose particular platforms, whereas others leave this decision entirely up to students.

When it comes to pedagogy, project and group work appears to be fundamentally important, particularly when it comes to developing employability skills and creating artefacts that can be presented to potential employers.  Keeping things open (in terms of either platforms or the problems that can be solved by the application of mobile technology) can present some challenges when it comes to assessment.  There seems to be some consensus that asking students to produce videos of their working apps might be a good approach.

Making a decision about what platform to use or to develop for isn't an easy one.  When I was a student I was once told by a faculty member that 'you really need to know how to use all types of technology'.  His point was that you will more readily be able to move between one platform and another.  In doing so, you'll gain a degree of flexibility that will allow you to appreciate how things might be done in different ways.  This is a perspective that has stuck with me and one that is important since the platform that you're using now will eventually become obsolete in a couple of years' time.

When it comes to mobile technology, everyone is trying to figure out what things we should be teaching and what the best approaches for teaching might be.  When we're dealing with an industry that is moving as quickly as it is, these kinds of events can be useful in terms of making connections and putting a marker in the ground whilst saying, 'this is how we do things today'.

Teaching and learning programming for mobile and tablet devices

I attended a HEA workshop about the teaching and learning of programming for mobile and tablet devices at London Metropolitan University on 15 June 2012.  This is a quick summary of my own take on what happened on the day, combined with a set of personal reflections, some of which I've added in the body of this summary.  I'm writing this with the hope that this summary might be useful for some of the attendees, and for others who were unable to attend.

In some ways, this was the second of a 'mini series' of two workshops about mobile technologies, the first being held at the University of Buckingham back in May 2012.  A quick write up of this earlier workshop, which had more of a focus on employability skills, can be viewed by visiting an earlier blog post.

The day began with an introduction by Dominic Palmer-Brown who clearly emphasised the importance of mobile technologies.  Dominic commented that the subject is particularly important 'to ourselves and our students', going on to emphasise that skills working with and developing mobile technologies are in demand by industry.  A number of presentations appeared to confirm that this was the case, particularly the final presentation.

The potential of social media and mobile devices in informal, professional and work-based learning

Professor John Cook, from London Metropolitan University, gave an opening keynote about how mobile devices could be used to help facilitate teaching and learning.  John introduced us to a number of different ideas and projects, enabling us to appreciate the variety of ways in which mobile devices may be used.  Mobile devices can be used to 'add information' to physical space, reminding me of research into wearable computing and the development of Google Goggles, for instance.

Connecting to the themes of location, history and learning, John introduced us to a project that enabled students, through the use of mobile devices, to learn more about the ruins of a Cistercian Abbey (Fountains Abbey, Wikipedia).  Mobile devices facilitate the delivery of different types of media which can be chosen depending upon the location of the user.

Whilst technology on its own is always interesting, its use and application can be enhanced through the understanding and application of pedagogic theories.  John made reference to Vygotsky (Wikipedia), who coined the term Zone of Proximal Development (Wikipedia).  Another important point that I noted is that peers play a very important role in learning, and John emphasised the importance of scaffolding learning activities (the subject of pedagogy, particularly inquiry based learning, was the focus of an earlier HEA event).  On a related note, I personally feel I have a fair way to go in terms of understanding how to make the best use of the technologies I have at my disposal.  The pedagogy of technology is something that I am sure I'll continue to mention in these blogs.

John also introduced an abbreviation that I was not familiar with: BYOD, meaning Bring Your Own Device.  Perhaps it has already got to a point where it may be surprising if a student doesn't bring some kind of mobile technology to their lectures.

It was interesting to hear the view that social media use in the workplace is an under-researched area.  This thought reminded me of an earlier presentation by Vanessa Gough, from IBM, at a previous HEA workshop about professional on-line identities, where she showed how employees were making use of social media to share information with each other.  Perhaps it is an area that is under researched, but I do sense that social media within the workplace is certainly being used and applied.

John also mentioned a new EU funded project called Learning Layers.  Like many EU projects, Learning Layers has a number of collaborators from different countries. Finally, some slides that connect to the ideas and the projects that John spoke of can be found on SlideShare.

Teaching Mobile App Development at Postgraduate level at London Metropolitan

Yanguo Jing gave the first 'main' presentation of the day, where he shared with us some of the experience that had been gained at London Met over the past five or six years.  Yanguo made reference to an industry report which predicted that mobile internet will overtake fixed internet by 2014.  It was also viewed that mobile technology skills, such as HTML 5, iOS and Android, are considered to be increasingly important.

Knowing about a particular skill is one thing; being able to demonstrate mastery of it is a different (but related) issue.  To address this challenge, Yanguo holds the view that students should ideally create a portfolio of apps (perhaps in combination with other students) to demonstrate their skills and abilities to a prospective employer.

Teaching of mobile technologies at London Met is through an industry-oriented practical approach that emphasises depth (in terms of making use of a single platform) as opposed to breadth (covering a number of different platforms).  I think this is important, since whatever platforms developers end up using, they always have got to 'get into the detail' of the environments and tools that they have to use. 

Key subjects that are covered in the module include the model-view-controller (MVC) design pattern, the use of an integrated development environment (IDE), aspects of visual design, issues relating to power and memory management, web services, development methods and object-oriented programming.
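
For readers who haven't met MVC before, a very small sketch of the pattern (plain Java, independent of any particular mobile framework) shows the separation of concerns it is aiming for:

    // A minimal illustration of the model-view-controller separation that
    // mobile frameworks encourage: the model holds state, the view renders
    // it, and the controller mediates between user actions and the model.
    public class MvcSketch {

        static class CounterModel {                 // model: application state
            private int count = 0;
            void increment() { count++; }
            int getCount() { return count; }
        }

        static class CounterView {                  // view: presentation only
            void render(int count) {
                System.out.println("Count is now: " + count);
            }
        }

        static class CounterController {            // controller: handles input
            private final CounterModel model;
            private final CounterView view;

            CounterController(CounterModel model, CounterView view) {
                this.model = model;
                this.view = view;
            }

            void onButtonPressed() {                // e.g. a tap on the screen
                model.increment();
                view.render(model.getCount());
            }
        }

        public static void main(String[] args) {
            CounterController controller =
                    new CounterController(new CounterModel(), new CounterView());
            controller.onButtonPressed();
            controller.onButtonPressed();
        }
    }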

One particular aspect of the teaching that was said to work well is the facilitation of peer-to-peer support (a point which connected to John's keynote).  Another great technique was to encourage students to teach each other through their own seminars, and allowing them to choose their own projects (thus helping to keep students motivated).

Approaches to teaching programming of mobile devices

Gordon Eccleston from Robert Gordon University shared with us some of the experience he gained whilst teaching students to develop iPod and iPhone apps.  Gordon began by asking an interesting question: 'is programming mobile devices different to other kinds of programming, such as programming using Java or .NET?'  His answer is 'not really'.  As with other aspects of programming, the only real way to learn is to get on and do it.  Gordon also argued that we might get to a point where we no longer distinguish between different types of device, such as a phone, a tablet or a laptop - we may end up calling them all 'computers' (especially since some mobile phones are now as powerful, computationally speaking, as laptops).  At some point in time, mobility may be an attribute that we automatically assume.

Gordon echoed John's earlier comments about BYOD.  Whilst at the moment Gordon provides his students with a set of iPod Touch devices which they can use (separately from any other device that they may own), one important consideration when teaching mobility may be the availability of effective WiFi in the classroom.

Increasingly, students may wish to work from home or work part time (which connects to John's earlier keynote on the subject of mobile learning).  To facilitate different ways of learning, institutions can make use of technology to allow students to gain access to learning.  Material can, of course, be delivered through an institutional VLE.

Gordon concluded his presentation by speaking about interactive books, which I remember reading was going to be Steve Jobs's 'next big thing'.  Gordon mentioned a company named Giglets which produces interactive multimedia 'books' for either PCs or eBook readers.  There is also the increasing possibility (or, even, likelihood) that students in primary schools may begin to make use of tablet devices.

This broader discussion about tablet devices in schools made me begin to wonder about the extent to which digital books and institutional services or systems (such as VLEs) can be connected together and how institutions can support the use of mobile technology through the use of organisational structures.  Whilst technology may sometimes help, organisational structures and support must always facilitate its use, but understanding how to best achieve this can be a whole different challenge.

Teaching Android Programming at Oxford Brookes

Ian Bayley and Faye Mitchell gave a joint presentation about their experience of teaching Android programming at Oxford Brookes.  I remember them clearly emphasising that mobility is a whole lot more than just the phone.  I completely agree.  One interesting observation is that programming is an activity that students continually find difficult.  When it comes to learning how to program, a high level of motivation is really important.  An interesting point is that students who may be strong at mathematics can find programming difficult.  Whilst mathematical skills may be useful, 'algorithmic thinking' may be something that is quite different.

Students are introduced to programming through the use of other tools and languages, such as Alice (which has been mentioned at a number of other HEA events), and Processing (which is a Java-based language that can be used to create graphics and data visualisations, for example).

I also remember hearing about the creation of screencasts to allow students to get a more direct understanding of some of the applications that are used.  Towards the end of the presentation there was time to discuss assessments.  Students are given the opportunity to create their own app.  Examinations, it was argued, were considered to be an inappropriate way to assess knowledge and understanding.  This is especially pertinent given the practical nature of mobile programming.

Bedfordshire's Experiences teaching app development with Lua and Corona SDK

Ian Masters' presentation was very different from the others.  Ian's talk was more of a demonstration of two different (and related) developments: a programming language called Lua (which I had never heard of), and a corresponding SDK called Corona (which I had also never heard of).  In combination with each other they represent a 2D game development environment for different mobile devices.  Interestingly, Lua and Corona are multi-platform, which means that code is (of course) transferable between different mobile operating systems and devices, making it a really attractive tool.

Ian began his presentation by defining a simple environment in which a game may be played.  This involved defining screen elements, such as a floor, and also blocks.  Another interesting aspect of the environment is that Corona also comes with its own physics engine.  Items that are defined on the screen can bounce on and fall off one another.  It looks to be really good fun!

Mobile Teaching Experience from University of Buckingham

Harin Sellahewa told us about a new module that is being taught at the University of Buckingham from September onwards.  The aims of the module are to introduce students to mobile application development, to help them to create a realistic app, and to enable them to understand the wider commercial opportunities and issues that surround mobile app development.

Some of the learning objectives include understanding the components of a smartphone (such as its various peripherals), critically understanding the differences between mobile devices and PCs, and being able to design, develop and test applications.  Interestingly, the module is using a Windows development platform.  One reason for this different focus is familiarity with the Xbox development environment that Buckingham already uses.  I look forward to hearing about how the first presentation went and what challenges were overcome.

Our experience of teaching mobile programming on different platforms at Staffordshire University

Catherine French and Dave Gillibrand presented some of their experiences of teaching mobile programming at Staffordshire University.  It was great to see that mobility has been a subject that has been taught at Staffordshire for quite some time, beginning with Java ME and Windows CE (PDAs) before moving onto Android and iOS.

One of the tasks (or assignments) that students are presented with is the challenge of creating a 2D game, which sounds tough.  To address this, a very useful and helpful teaching paradigm has been adopted where students are given code examples and are then encouraged to change them.  This was considered to be particularly useful for some aspects of programming, such as multi-threading, which students can find difficult.
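
A starter example of the sort described might look something like the following minimal sketch (plain Java rather than any particular mobile SDK); students could run it, then be asked to change the number of workers or the work each one does and observe what happens:

    // A small multi-threading starter example of the kind students might be
    // given to modify: each worker runs on its own thread, simulating
    // background work (for instance, fetching data while a game's UI keeps
    // running).
    public class WorkerDemo {
        public static void main(String[] args) throws InterruptedException {
            Thread[] workers = new Thread[3];
            for (int i = 0; i < workers.length; i++) {
                final int id = i;
                workers[i] = new Thread(() -> {
                    for (int step = 1; step <= 3; step++) {
                        System.out.println("Worker " + id + " finished step " + step);
                        try {
                            Thread.sleep(100);   // simulate slow work (e.g. a network call)
                        } catch (InterruptedException e) {
                            Thread.currentThread().interrupt();
                            return;
                        }
                    }
                });
                workers[i].start();
            }
            for (Thread worker : workers) {
                worker.join();                   // wait for all workers to finish
            }
            System.out.println("All workers done");
        }
    }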

I hold the view that using examples is a really good idea; I very often used this strategy when I was working in industry.  Examples give students a combination of relatively immediate results (which can be rewarding) whilst also providing the materials that allow learners to gain an understanding of how things work, which may be only acquired over time.

An important point that was made is that using a real mobile device is so much better than an emulator.  Whilst an emulator can simulate the operation of some mobile peripherals, such as the GPS sensor, for example, other aspects of a mobile device, such as the behaviour of the touch screen, are best experienced (and tested) with a real device.

I was impressed by the breadth of subjects that students may be introduced to as a part of their studies.  These may include consuming public web services, development of an application using agile techniques which can include the use of test driven development (TDD) and using tools that are used in industry, such as Subversion.
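
By way of illustration of the test driven style mentioned above, here is a minimal sketch of a test written before the code it exercises (it assumes JUnit 4 is on the classpath; TipCalculator is a hypothetical class a student would then implement just far enough to make the test pass):

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    // A minimal test-first sketch: the test below is written before the code
    // it exercises.  TipCalculator is a hypothetical class a student would
    // implement just far enough to make the test pass.
    public class TipCalculatorTest {

        static class TipCalculator {
            static double totalWithTip(double bill, double tipRate) {
                return bill + bill * tipRate;
            }
        }

        @Test
        public void addsTenPercentTip() {
            assertEquals(22.0, TipCalculator.totalWithTip(20.0, 0.10), 0.001);
        }
    }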

A final point is that some students may begin a module with the view that developing apps is something that could be easy.  Programming is something that certainly isn't easy.  I guess a personal reflection is that educators not only need to convey difficult technical concepts and expose problem solving challenges to students; they also need to work to manage expectations.  Programming, irrespective of whatever form it takes, is a craft, and it takes time to acquire craft knowledge (and experience).

From the desktop to devices: teaching interaction design

I have to confess that I was responsible for the penultimate presentation of the day.  Tempting though it is, I'm not going to write in the third person for this part of this blog.  Instead, I'll refer to myself as 'I' as opposed to 'Chris'.

My own presentation was slightly different than all the others since it wasn't about mobile technology or even about programming.  Instead it focused upon the process of designing interactive products and experiences (of which, programming will eventually play an important part).  My presentation was based on experience gained as an Open University associate lecturer over the past six or so years where I have tutored a module entitled Fundamentals of Interaction Design (which I'll call M364).

M364 is a great module.  It introduces students to key concepts such as usability goals, user experience goals and design principles.  It then helps students to appreciate the power of sketching.  Students are introduced to the concepts of evaluation where they are then encouraged to understand the advantages and disadvantages of different approaches.

During my presentation I described a scenario where a mobile device to guide a visitor around a historical location needed to be designed.  I quickly outlined different types of sketches.  The first was a storyboard, which enables designers to think about the broader context in which a product is used.  The second is a card-based prototype which allows designers to consider the sequence of interactions (and even simulate them).  The final sketch was a more detailed interface sketch which contained more detailed design about icons and how information is presented to a user.

The title of my brief presentation reflects the notion that the design process can be applied to many different kinds of platforms and devices.  Not only can the interaction design process be applied to mobile or desktop applications, but also to static devices, such as ticket machines, for example.

Why teaching mobile? An Industry's perspective

The final presentation of the day was by Abdul Hamid.  One of the striking aspects of Abdul's presentation was where he shared with us some graphs from an on-line job site (Indeed) which emphasised the demand for certain mobile skills.  Some older skills, it was argued, were waning in popularity whilst others (particularly those that were mobile related) were becoming increasingly popular.

Reflections

I felt that this was a very cohesive event, in the sense that there were a number of presentations that were entirely dedicated to sharing of not only teaching practice (and insights about what works and what doesn't), but there was a lot of commonality in terms of technologies and tools.  Although there were many high points of the day, the highlight for me was finding out about Lua and Corona.  I had never heard of these tools before, which reminded me of how difficult it is sometimes to keep up to date in a fast moving field, such as mobile technology and software development.

As mentioned earlier, technology is a part of a bigger picture.  John's presentation touched upon the importance of theory and history, particularly with regards to the domain of mobile learning.  Mobile has an important role to play within business, commerce and our wider social environment.  Other disciplines will undoubtedly play an increasing role when it comes to understanding the part that mobile technology plays in our everyday lives.  Just to echo words from John's keynote, pedagogy, usability and content are all important areas.

At the end of the workshop there was a short opportunity to discuss how the participants could potentially work together, collaborate and continue to share practice.  There was also some debate about having a follow up meeting next year: a really positive outcome - congratulations to the organisers at London Met!

Life in the fast lane? Towards a sociology of technology and time

On a recent trip to Milton Keynes on 29 May 2012 I had the opportunity to attend a Society and Information Research Group (SIRG) seminar by Judy Wajcman, Professor of Sociology at the London School of Economics (LSE).  Judy's presentation, very broadly speaking, was about technology and time and whether one affects the other.  Her seminar was related to research that may feed into a book that she is currently working on.  This post is a personal reflection on some of the themes that struck me as being significant and important in my own work.  Others who attended the seminar are very likely to have picked up on other issues (and I encourage them to add comments below).

Timing

For me, the timing of her seminar couldn't have been better.  My last blog post was about an event that shared practice about how lecturers and institutions could most effectively help students to develop software for mobile devices.  During this event mobility was portrayed as an opportunity, but there was also an implicit assertion that mobile technology will change how we work.  In doing so, mobile technology can affect how we spend our time.

Productive work may not cease the moment we leave the office, but can instead continue for the duration of our commute home.  Work may invade our personal time too, since we can easily take our devices away on holiday with us.  Important messages that conclude with a succinct 'sent from my iPhone' clearly suggest that we are working whilst we are on the move.

Judy mentioned that perhaps some of these concerns mainly relate to 'management or professional types', and this might be the case.  But one way to really understand the issue (of time, and how it is affected by technology) is to carry out studies, particularly ethnographic studies, to observe how people really use technology.

Research methods

Such methods are briefly discussed within a module such as M364 Fundamentals of Interaction Design, which is concerned with how to make devices and systems that are usable by people.  Two approaches used for the evaluation of the success of products include ethnographic studies (observing users), and asking users to complete diary studies.  Judy's presentation emphasised the point that interdisciplinary research is a necessity if we are to understand the way in which technology impacts our lives.

Judy managed to connect my immediate concerns about mobile technology and its impact on our time with earlier debates.  The introduction of washing machines and other labour saving devices was touted as a way to 'save time'.  This raised the questions of 'what happens when we get that time back?  How might we spend it?'  Unpicking these questions leads us into further interesting debates, which relate to the different ways in which men and women use the time that they have available, and towards the broader concerns of capitalism.

One point that Judy mentioned in passing (which I've remembered reading or hearing before) is that perhaps we have been 'cheated by capitalism'.  Perhaps the extra time we have gained hasn't been spent on leisure, but instead has been spent on doing even more work, which allows us to buy more stuff (since, perhaps, everyone else is doing the same).  A personal reflection is that mobile devices also act as devices of consumption.  Not only do they facilitate the extension of work into our 'dead time', but also permit us to browse eBay and on-line stores whilst travelling on a train, for instance.

Technology and speed

Returning to the main debate, does technology cause us to work 'faster' or more?  Is the pace of our lives accelerating because we can access so much more information than ever before?  Judy urges caution and asks us to consider causality.  On one hand there is technological determinism (wikipedia), but on the other there is social determinism (wikipedia).  Mobility can facilitate new ways of interacting with people, which may then, in turn, give rise to new technologies.  It could be argued that each helps to shape the other.

Judy cautions against having the individual as the focus of our attention.  People live and work with each other.  Perhaps the household should be the focus of our attention when it comes to understanding the influence of technology on our lives.

What was clear from Judy's seminar was that there were many different areas of literature that could be brought to bear on understanding technology, time and how we spend it.  During her talk I made a note of a number of references that might be interesting to some.  The first was an edited book entitled High-speed society: social acceleration, power, and modernity, edited by Hartmut Rosa and William E Scheuerman.  The second was entitled, Shock of the old: technology and global history since 1900, by David Edgerton.  The final book that I have extracted from my notes is that of, Alone together: why we expect more from technology and less from each other, by Sherry Turkle (MIT, homepage).

Reflections

An enjoyable and thought provoking seminar which highlighted an important point that when you begin to scratch the surface of a question you then open up a broader set of connected and related issues.  Important subjects include the importance of the wider context in which technology is used and what tools and approaches we might use to understand our environment.  I was reminded of the obvious truth that, given technology firmly exists within the human context, learning from disciplines such as history and sociology is as important as drawing upon lessons from science and engineering.

Mobile Application Development: from curriculum design to graduate employability

I had never visited the University of Buckingham before.  It was on the morning of Tuesday 15 May 2012 that I found myself travelling to Milton Keynes railway station to meet with a pre-booked taxi that would whisk me into the unknowns of the Buckinghamshire countryside towards an event that was intended to share practice about the teaching of mobile technology.  Although I had never visited Buckingham, I have heard it being spoken of many times before; a radical institution which was founded at approximately the same time as another radical institution, the Open University. 

As well as sharing practice about the teaching of mobile application development another really important theme was the subject of employability and the open question of whether universities are 'teaching the right stuff' to enable graduates to immediately make a contribution in the workplace.

This blog post is a summary of a visit to a HEA event entitled 'Mobile Application Development: from curriculum design to graduate employability'.  If I've missed any key points, I encourage the fellow participants and delegates to add comments below.

Industry keynote

Lee Stott, an academic evangelist from Microsoft, kicked off the day with a really interesting keynote.  Lee is from a part of Microsoft that works with university departments (Microsoft Faculty pages).

Lee emphasised the point that users expect connectivity.  I made a note of an interesting quote that went 'mobility plus cloud equals opportunity'.  It's easy to imagine (or even remember) situations where one gained access to information whilst travelling to solve a problem, such as finding the address of a location or accessing some urgently needed information.

Lee also made the point that mobile devices are our predominant work tool (or tools).  A tool, of course, might be a phone or a laptop.  This is certainly true in my case; I often haul my laptop between the OU's headquarters in Milton Keynes and my home, sometimes using the dead time on a train to do some marking.  Another thought that comes to mind is whether mobility is causing work time to encroach on our personal time, but this is a whole other debate (and one that I hope to connect with by writing another blog post about a recent seminar).

The usefulness of an app depends on a combination of its functionality, the functionality of a device and the availability of a network.  Apps need to be not only functional but also graphically appealing.  Lee emphasised the importance of designers, not just software designers, but graphic designers.  This connects to an important point which is that creating good apps is an interdisciplinary activity - a combination of technology, business and art. Writing commercial apps isn't just about writing software that works - apps need to be 'hardened': tested thoroughly and checked for vulnerabilities.

Microsoft, along with other mobile platform vendors such as Google and Apple, has its own ecosystem of tools, technologies and platforms.  Microsoft is but one of many platforms that educators can choose from.

I have to confess (for my sins) that I used to be a software developer who mostly specialised in Microsoft technologies.  I used to use .NET, MS SQL and a bunch of other stuff.  It has been, however, a few years since I've done this.  Lee introduced technologies that were entirely new to me, such as Microsoft Azure (wikipedia) and Microsoft XNA (wikipedia) for Xbox.  Lee also mentioned other software that was on the near horizon, such as Windows 8 (wikipedia), which can be used on 'slate' (or tablet) devices.

Lee also touched upon the important subject of recruitment.  Lee emphasised that it is important to encourage students to build apps and sell them through app marketplaces to create a portfolio which can be shown to potential employers.

The question and answer session was interesting.  There was some discussion about cross-platform approaches to development and the fact that when you go cross-platform you lose some of the functionality of the device's native operating system.  The subject of native code versus multi-platform code was a debate that arose on a number of occasions throughout the day.  HTML 5 (wikipedia) was regularly mentioned, along with platforms such as PhoneGap (PhoneGap website).

Another tension, which exists particularly when industry representatives and university representatives debate curriculum, is the difference between education and training.  Industry wants people who are fully trained (and ideally wants universities to do this), but the real role of universities when it comes to technology (in my opinion) is to enable students to learn how best to learn and to adapt to new tools and situations.  Lee made the point that the teaching of fundamentals is essential.  I agree.  Conveying principles through the use of vendor-specific tools, whilst presenting concepts in a general way so that other technologies can be understood, is a difficult thing to achieve.

Mobile application development: a journey thus far

Harin Sellakewa from the University of Buckingham gave a presentation that described how mobile technology came to be taught, in its current form, at Buckingham.  Harin described how some of the curriculum had changed and outlined the introduction of new modules.  The use of mobile technology had been explored in a number of projects, including some funded by the EU.

Some of the key learning objectives of a module on mobile software were mentioned: how to design applications (or apps), understanding different components and learning about various guidelines and specifications.  All these learning objectives could then contribute to making an application that could be sold on the free market.

Harin also gave us a number of useful tips.  Any new module must (of course) satisfactorily complement existing modules; he also advised getting people involved, speaking to different vendors, starting with student projects, attending training events run by industry, and taking the time to network.

A number of different topics were exposed through the question and answer session.  As well as a discussion about different technologies, an industry representative mentioned the importance of candidates having a portfolio of work to demonstrate to prospective employers.  One point that stuck in my mind was that an unfinished application has the potential to work against an applicant; showing something polished and complete is necessary.

Developing Apps in Schools

Aaron Peck teaches computing and ICT at the Royal Latin School, Buckingham, a school just around the corner from the university.  Aaron began by speaking about wider debates around the GCSE computing curriculum, mentioning the OCR GCSE, which was said to contain three key components: programming, a research project and an examination.

Aaron emphasised fun and mentioned the use of the MIT Scratch (Scratch website) environment.  He also went on to speak about mobile devices, a technology that the pupils are invariably likely to be familiar with.  Here lies an obvious collision of ideas: why not teach programming through the use of mobile devices?

Scratch has, of course, some distinct advantages - it is immediate and gets around the tyranny of fiddly syntax by providing students with a graphical environment in which they can play.  Another programming environment that has a graphical world is the MIT App Inventor (App Inventor website), which allows users to create apps for Android phones.

Students are encouraged to create small projects, which may include a simple calculator, a recipe book or a hangman game.  The creation of apps has the potential to open up further discussion of wider issues, such as how such developments might be commercialised.  I remember an anecdote from Aaron, where he was asked by a student about how much an app programmer might earn; a testament to his ability to instil enthusiasm and engaging choices of technology.

There are some advantages to using App Inventor: it can be used on multiple development platforms, it is relatively simple to install, and since students may have used Scratch during earlier studies, the graphical nature of the programming environment is (potentially) more easily grasped.

Aaron isn't stopping at creating apps with App Inventor.  He mentioned his intention to try to work with Lego Mindstorms robots through the Android SDK, where it might be possible to create a 'remote control' app using Bluetooth radio.  Aaron also mentioned that there was an opportunity to share the workings of HTML and Javascript with his students.  If my memory isn't playing tricks on me, I also seem to recall that he mentioned that one of his students was inspired enough to use C++.

The question and answer session led us to subjects and technology such as Microsoft Kodu and Microsoft Gadgeteer.  Other important issues include addressing the gender imbalance and how to motivate all student groups, including those who may not have a strong technical bias.

I really enjoyed this talk.  Two big parts of the tech were familiar to me: Scratch (or as I know it, Sense) and App Inventor.  Both products are used as a part of different Open University computing modules, TU100 My Digital Life and TT284 Web Technologies.  It was an eye opener, for me, to see how these products could be used as a way to inspire students at GCSE level.

Mobile Assessment

The use of mobile technology to help teaching and learning seems to be a hot topic at the moment.  Joan Lu gave a presentation about the use of mobile technology for assessment and also mentioned the use of student response systems, making reference to an EU funded project entitled Do-IT.  Joan is from the XDIR research group at the University of Huddersfield, which has carried out research projects related to mobile technology.

Designing the mobile syllabus to enhance student employability

Yanguo Jing from London Metropolitan University gave a presentation about his first hand experiences of teaching about mobile technology to his postgraduate students.  It was a really interesting presentation that was packed with useful tips, not just about teaching but also about industrial engagement too.

Returning to the subject of multiple platforms and environments, Yanguo said that initially he tried to teach a little bit about all the major toolsets.  He came to the conclusion that this was less than ideal.  Although students might be given breadth, getting to the 'depth' is always a challenge.  It was decided, therefore, to focus on one particular platform and use the experience with the platform to make points that are important in other platforms too.  This is a very sensible practical decision; there is only so much detail that a lecturer can hold in his or her head at any one time.

Understanding mobile isn't just about understanding technology and the fundamentals of creating some executable code that runs on a device; it is also about understanding the surrounding business and economic area.  Connecting back to the idea of creating marketable apps that Harin touched upon in his earlier presentation, Yanguo described how he encourages his students to enter application competitions, or Appathons.  He also mentioned that students were encouraged to attend an industry conference, DroidCon, to gain first hand experience of what is happening within industry.  It was interesting to hear that Yanguo is a part of an industry liaison group.  Not only does this facilitate a connection between academics and industry, it can also act as a connection between industry and students too.

Finally, it is also perhaps worth mentioning that Yanguo is helping to organise a related HEA event on mobile technology on 15 June 2012, entitled Workshop on Teaching and Learning Programming for Mobile and Tablet Devices.  It sounds like it's going to be a great event!

Programming with iOS

Gordon Eccleston from Robert Gordon University, Aberdeen shared some of his experiences of teaching using Apple's iOS.  This platform enabled students to learn something about HCI principles and also about object-oriented programming (through the use of Objective-C).

Gordon offered a key tip which echoed earlier discussions in the event.  He said, 'keep your modules as generic as possible'.  Inspiration and information that informed the creation of his module included looking at different text books and short courses that were designed for industry.  Studying the documentation provided by the vendor can be a very useful source of materials that can help to guide or inform the creation of aspects of a module.

Gordon spoke about lab based teaching (in a lab containing lots of Apple kit) and student course work.  Gordon then went on to present a brief overview of a number of different student projects.  The value of projects cannot be overstated.  A good project connects the technology with broader issues of business and also helps to give the student some good materials that can be immediately demonstrated to a potential employer (I have this image of an interviewee handing their phone to an interviewer whilst saying, 'this is what I've done').  One project that stuck in my mind was an app that illustrated a fashion portfolio, which demonstrates a connection between apps and marketing.

Gordon's session inspired a really interesting question and answer session.  One point was that PC (or Mac) based simulators are all very well, but it's also important (as well as rewarding) to allow students to run their software on actual devices (such as an iPod touch).  For one thing, it allows developers to gain access to device-only peripherals, such as accelerometers and other sensors, that they wouldn't otherwise have access to.

Reflection of curriculum design and delivery in mobile computing

Khawar Hamed from the University of Staffordshire spoke about his experiences of curriculum design.  Khawar's presentation reminded me that an app sits at the top of a technology pyramid.  Along with the operating system of a device, apps are perhaps the most visible software artefact that users interact with.  Underneath the app and beyond the phone there is a sophisticated digital infrastructure that enables devices to work.  Some of the modules that Khawar mentioned allow students to begin to study these underlying technologies.  Another point is that mobility isn't just about technology; it's also about enabling organisations to achieve their objectives.

Khawar touched upon other issues such as the importance of getting the right name for a course or programme.  Since the names and phrases used to describe technology can change relatively quickly, perhaps the names of modules and programmes should be prepared to change too?  An important point was to always seek industrial involvement wherever possible.  Connecting to this point, Khawar mentioned an organisation called The Wireless University Forum.

One really interesting debate that emerged from this presentation centred upon whether an institution should provide devices that students can transfer code to.  The answer was a resounding 'yes'.  Not everyone will have an Android phone, or an iPhone (or even a smartphone, although this is something that is changing).  Plus, providing a device delineates between what is a 'learning' device and what is a 'personal' device.

Mobile app development - creativity, skills and evidence

The final talk of the day was a second keynote.  Andrew Lapham, from Yell Labs, gave an enthusiastic presentation about the work that his team carries out and the characteristics he looks for in potential employees.  Key points include the ability to be creative and generate new and interesting ideas, strong communication skills (the ability to communicate those ideas and to persuade others of their merit), and an underlying enthusiasm for technology and what it might be able to achieve.

The notion of having a portfolio of evidence was also touched upon.  Whilst demonstration of apps or talking through a pet project is impressive, what is more impressive is having evidence that your own product or code has been marketed.  This might include having a blog about a product, and also gathering some evidence about how your customers view your product.

Reflections

One thing about this day surprised me: the exceptionally strong focus on apps.  In retrospect, it shouldn't have been a surprise at all.  Apps are the way to consume software on mobile devices.

I certainly sense that teaching programming for mobile devices isn't easy.  Each platform comes attached to an ecology of tools (and a whole set of accompanying vocabulary) and techniques.  Teaching everything just isn't an option, but teaching in depth is surely the right way to go.  Educators will therefore have to choose a platform and figure out how to connect a technology choice to wider principles, to enable graduates to more readily get to grips with the new environments they will inevitably face.

One really interesting question is whether mobility, and the technology that goes with it, is changing software engineering.  It's not a question that seems to have an easy answer, but perhaps user-facing apps require different design methods than the lower-level software that supports the networking infrastructure; those who have stronger connections with industry would be better placed to comment.

A final reflection relates to the creation of a portfolio that can help during the recruitment process.  The importance of a personal portfolio was emphasised in a recent HEA event at the University of Greenwich about gaming and animation.  Employers like to see what applicants have done.  Furthermore, a portfolio gives employers the opportunity to find out about the difficulties that applicants faced and how they were overcome.

When it comes to being an app developer, the message was clear: a portfolio of well-crafted working apps was clearly something that employers would like to see.

Congratulations to Buckingham for running a fun and thought provoking event!

Christopher Douce

Using and teaching mobile technologies for ICT and computer science

Visible to anyone in the world
Edited by Christopher Douce, Monday, 3 Mar 2014, 18:48

 I recently attended an event entitled Mobile Technologies - The Challenge of Learner Devices Delivering Computer Science held at Birmingham City University last week, organised by the Information and Computer Sciences (ICS) Higher Education Academy (HEA) subject centre.

This blog post aims to present a summary of proceedings as well as my own reflections on the day. If any of the delegates or presenters read this (and have any comments), then please feel free to post a reply to add to or correct anything that I've written. I hope these notes might be useful to someone.

Keynote

The day was kicked off by John Traxler from the University of Wolverhampton. Just as any good keynote should, John asked a number of searching questions. The ones that jumped out at me were whether information technology (or computers) had accelerated the industrialisation of education, and whether mobile technologies may contribute to this.

John wondered about the changing nature of technology ownership. On one hand universities maintain rooms filled with computers that students can use, but on the other hand students increasingly have their own devices, such as laptops or mobile phones. 

John also pointed us towards an article in the Guardian, published in July 2010 about teenagers and technology which has a rather challenging subtitle. Mobility and connectedness, it is argued, has now become a part of our identity.

One thing John said jumped out at me: 'requiring students to use a VLE is like asking them to wear a school uniform'. This analogy points towards a lot of issues that can be unpacked. Certainly, a VLE has the potential to present institutional branding, and a uniform suggests that things might be done in a particular way. But a VLE also has the potential to be an invaluable source of information, helping us to know what we need to know to navigate around an institution.

Those of us who had to wear school uniforms customised them as much as we possibly could without getting told off for breaking the rules. Within those constraints, it was possible to express individuality whilst conforming (to get an education). The notion of customisation and services also has a connection with the idea of a Personal Learning Environment (PLE) (wikipedia), which, in reality, might exist somewhere in between the world of the mobile, a personal laptop and the services that an institution provides.

Session I

The first session was opened by Kathy Maitland from Birmingham City University. Kathy talked about how she used cloud computing to enable students using different hardware to access different software services. She spoke about the challenge of using different hardware (and operating system) platforms to access services and the technical challenges of ensuring correct configuration.

John Busch from Queen's University Belfast made a presentation about how to record lectures using a mobile phone. It was great to see a (relatively) low tech approach being used to make educational materials available for students. All John needed to share his lectures with a wider audience was a mid range mobile phone, a tiny tripod, a desk to perch the mobile phone on, and (presumably) a lot of hard won experience.

John gave the audience a lot of tips about how to make the best use of technology, along with a result from a survey where he asked students how they made use of the recordings he made of his computer gaming lectures. 

A part of his talk was necessarily technical, where he spoke about different data encoding standards and which standard was supported by which mobile (or desktop) platform. One of the members of the audience pointed us to Encoding.com, a website that enables transcoding of digital media. The presentation gave way to interesting discussions about privacy. One of the things that I really liked about John's presentation was that it addressed 'mobile' from different perspectives at the same time: using mobile technology to produce content that may, in turn, be consumed by other mobile devices.

Laura Crane, from Lancaster University, then gave an interesting presentation about using location, context and preference in VLE information delivery. Laura's main research question appeared to be: which is (potentially) more useful, information that is presented at a particular location, or information that is presented at a particular time?

This reminded me of some research that I had heard of a couple of years ago called context modelling. Laura mentioned a subject or area that was new to me, namely, Situation Theory. Laura's talk was very well received and it inspired a lot of debate. Topics discussed included the nature of mobility research and the importance of personal or learner attributes on learning (such as learning styles). Discussions edged towards the very active area of recommender research (recommender system, Wikipedia), and out to wider questions of combining location, recommender and affective interfaces (interfaces or systems that could give recommendations or make suggestions depending on emotion). A great talk!

Darren Mundy and Keith Dykes gave a presentation about the WILD Project funded by JISC. WILD is an abbreviation for Wireless Interactive Lecture Demonstrator. The idea behind the project is one that is simple and compelling: how to make use of personal technology to enable students to make a contribution to lectures. By contribution, I mean allowing students to add comments and text to a shared PowerPoint presentation.

A lecturer prepares a PowerPoint presentation and, providing there is appropriate internet connectivity, includes a link to a WILD webpage to which the students can send messages. This might be used to facilitate debate about a particular subject, but it also enables those learners who are more reluctant to contribute to 'speak up' by 'texting in'. We were also directed towards the project source code.
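
The project source code is, of course, the definitive reference, but to make the general pattern concrete, here is a minimal, hypothetical sketch in Python (emphatically not the actual WILD implementation) of a tiny web endpoint that collects student messages against a slide number; a presentation plug-in could then poll it and display new contributions. The route name and data fields are my own invention.

# A hypothetical sketch of the 'texting in to a lecture' pattern; not the WILD code.
# Students POST short messages against a slide number; the lecturer's presentation
# (or a plug-in) polls GET to display what has arrived so far.
from flask import Flask, request, jsonify

app = Flask(__name__)
comments = {}  # slide number -> list of messages (in-memory only, for illustration)

@app.route("/comments/<int:slide>", methods=["GET", "POST"])
def comments_for_slide(slide):
    if request.method == "POST":
        message = (request.get_json(silent=True) or {}).get("message", "").strip()
        if message:
            comments.setdefault(slide, []).append(message)
        return jsonify(accepted=bool(message))
    # GET: return everything posted against this slide so far
    return jsonify(messages=comments.get(slide, []))

if __name__ == "__main__":
    app.run(port=8000)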

During the talk, I was introduced to a word that I had never heard of before: prosumerism (but apparently Wikipedia had!). At the end of the talk, during the Q&A session, one delegate pointed us towards the SAP Twitter PowerPoint plug-in, which might be able to achieve similar things.

This last presentation of the morning really got me thinking about my own educational practice, and perhaps this is one of the really powerful aspects of using and working with learning technology: it has the potential to encourage reflection about what is and what is not possible, both inside and outside the classroom. I tutor on an undergraduate interaction design course with the Open University, where I facilitate a number of face to face sessions.

Due to various reasons my tutorials are not as well attended as they could be. Students may have difficulty travelling to a tutorial session, they may have family responsibilities, or even have jobs at the weekend. This is a shame, since I sense that some students would really benefit from these face to face sessions. The WILD presentation made me wonder whether those students who attend a face to face tutorial might be able to collectively author a summary PowerPoint that could then be shared with the group of students who were unable to attend. Interactivity, of course, has the potential to foster inclusivity and ownership. Simply put, the more a student does within a lecture (or puts into it), the more they may get out of it.

Session II

After lunch, the second session proved to be slightly more technical. The first half was merely a warm up!

The second session kicked off with a demonstration by Doug Belshaw. Doug works for JISCInfoNet. This part of JISC aims to provide information and products known as InfoKits which can be used by senior management to understand and appreciate a range of different education and technology issues. We were directed towards examples, such as effective practice in a digital age, and effective assessment in a digital age. A new kit, entitled JISC mobile and wireless technologies review, is currently in preparation.

Doug asked the audience to share information about any case studies. A number of projects were mentioned, along with a set of links. During the discussion part of the demo we were directed towards m.sunderland.ac.uk, and this makes me wonder whether the 'm.' is a convention that I'm not aware of (and perhaps ought to be!) Something called iWebKit was also mentioned. Other projects included MyMobileBristol.com, in collaboration with Bristol University and Bristol City Council. For more information visit the m.bristol.ac.uk site.

There was also a mention of a service provided by Oxford University, m.ox.ac.uk (the project also has an accompanying press release). This service appears to have been developed in association with something called the Molly Project, which seems to be a mobile application development framework. There was a lot to take in!

Gordon Eccleston from Robert Gordon University in Aberdeen gave a fabulous presentation about his work teaching programming for the iPhone. Having remained steadfastly in the desktop world myself, and admitting to being a laggard on the mobile technology front, I found that Gordon answered many questions that I have always had about how one might potentially begin to write an iPhone application. Gordon introduced us to the iPhone software development kit, which I understand was free to universities. The software used to create apps is called Xcode. Having predominantly worked within a PC software development environment for more years than I would care to admit, a quick poke around the Apple Tools website looked rather exciting; a whole new world of languages, terms and technologies.

Gordon had a number of views about the future of app development. He thought that XHTML 5, CSS 3 and accompanying technologies would have an increasingly important role to play. On a related note, the cross-platform toolkit PhoneGap, which makes use of some of these same technologies, was mentioned during the following presentation. (Digging further into the web, there's a Wikipedia page called Multiple phone web based application framework, which might prove to be interesting.) There was also some debate about which mobile platform might dominate (and whether dominance may depend on how many Apple stores there are within a particular city or country!)

Gordon also briefly talked about some of the student projects he has been involved with. A notable example was an iPhone app for medical students to learn ophthalmology terms and concepts. There were some really good ideas here about how to create applications that directly benefit learners through mobile technology, while students learn how such apps can be developed.

Karsten Lundqvist from the University of Reading offered some technology balance to the day by presenting his work teaching the development of Android applications. Karsten began his presentation by considering the different platforms: iPhone, RIM, and Android, but the choice of platform was ultimately decided by the availability of existing hardware, namely, PCs running Windows or Linux. In place of Xcode, Java with Eclipse was used. I seem to remember that students may have had some experience using C/C++ before attending the classes, but I can't quite remember.

The question and answer session was really interesting. One delegate asked Karsten whether he had heard of something called the Google Android App Inventor, another mobile software development platform. It was also interesting to hear about the different demo apps. Karsten showed us a picture of a phone in a mini-Segway cradle, demonstrating the concept of real-time control; there was also a reference to an app that may help people with language difficulties, and Karsten pointed us to his own website, where he has been developing a game template by means of a blog tutorial.

Towards the end of Karsten's session, I recall an echo from the earlier HEA employability event which explored computing forensics. One of the ideas coming from that event was that perhaps it might be a good idea for institutions to share forensic data sets. An idea posed within this event was that perhaps institutions might be able to share application ideas or templates, perhaps for different platforms. Some ideas might include fitness utilities, 'finding your way around' apps (very useful: I still remember being a confused fresher during my undergraduate days!), simple game templates, and flash card apps to help students learn a number of different concepts.

Plenary

The plenary discussion was quite wide ranging, and it is quite difficult to boil it down to a couple of paragraphs. My own attempt at making sense of the day was to understand the key topics in terms of 'paired terms', which might be either subject dimensions or tensions (depending on how you look at it).

VLEs and apps: different software with different purposes, which connect to the idea of information and content. Information might be where to go to find a lecture theatre, or the location of a bank, and content is a representation of the course materials itself.

Ownership and provision: invariably students will have their own technology, but to what extent should an organisation provide technology to facilitate learning? Provision has been historically thought of in terms of rooms filled with computers, and necessarily conservative institutional IT provision (to make sure that everything keeps working). Entwined with these issues is the notion of legacy information and the need for institutions (and learners) to keep up with technology.

Development and usage: where does the information or content come from? To what extent might consumers of mobile information potentially participate in the development of their own content? Might this also create potential dangers for institutions and individuals? This is related to another tension of control, namely, institutional versus individual control, of either information, content or technology.

Guidance and figuring things out: when it comes to learning, there is always a balance to be reached between providing guidance and allowing learners to find the information that they need for themselves. On one hand, there may be certain apps that facilitate learning in their own right, apps that provide information, and apps that present content held within a VLE. One idea might be that we need a taxonomy of uses, for both the institution and the individual.

Industry and academia: a two way relationship. We must provide education (about mobile) that industry needs, and also make use of innovations coming from industry, but also we have a role to innovate ourselves and potentially feedback into industry. (I seem to recall quite a few delegates mentioning something called mCampus, but I haven't been able to uncover any information about it!)

Other discussion points that were raised included the observation that location-based information provision is new, and the need to interact with people is one of the things that is driving the development of technology. A broader question, posed by John Traxler was, 'does mobile have the potential to transform teaching and learning?' Learners, of course, differ very widely in terms of their experience and attitude to interactive products.

Points such as accessibility, whether it be the availability of technology or the ability to perceive information through assistive technologies, are also substantial issues. The wider organisational and political environment is also a significant factor when it comes to the development of mobile applications, and their subsequent consumption.

Footnote

All in all, a very enjoyable day! As I travelled into Birmingham from London on the train on the morning of the event, my eye caught what used to be the site of an old industrial centre. I had no idea what it used to be. I could see the foundations of what might have been a big factory or a depot. I was quite surprised to discover that the Millennium Point building also overlooked the same area.

Walking to the train station for my return journey to London, I thought, 'wouldn't it be great if there was an app that could use your location to get articles and pictures about what used to be here before; perhaps there could be a timeline control which users could change to go back in time to see what was there perhaps twenty, thirty or even one hundred years before'. I imagined a personal time machine in the palm of your hand. I then recalled a mash-up between Google Maps and Wikipedia, and had soon uncovered something called Wikimapia.

Like so many of these passing ideas, there's no such thing as an original thought. What really matters is how such thoughts about technology are realised, and the ultimate benefit they may have for different sets of end users.

Christopher Douce

Personalising museum experience

Visible to anyone in the world
Edited by Christopher Douce, Wednesday, 21 Jul 2010, 17:48

Thyssen-Bornemisza museum, Madrid

Last year was a fun year.  At one point I found I had a number of hours to kill before I caught an onward travel connection.  Since I was travelling through a city, I decided to spend some of that time visiting museums.

I have to confess I really like museums.  My favourite type is the science and engineering museum. I really like looking at machines, mechanisms and drawings, and learning about the people and situations that shaped them.  I also like visiting art museums, but I will be the first to confess that I do find some of the exhibits they contain a little difficult to understand.

Starting my exploration

I stepped into the Thyssen-Bornemisza museum (wikipedia) with mild trepidation, not really knowing what I was letting myself in for.  After the entrance area I discovered a desk that was renting audio guides.  Since I felt that I might gain something from the use of an audio guide (and since I was travelling alone, it could offer me some company), I decided to rent one for a couple of hours.

With my guide in hand I started to wander around the gallery.  The paintings appeared to be set out in a very particular and deliberate way.  The gallery designer was obviously trying to tell me something about the history of art (about which I know next to nothing).  The paintings gradually changed from impressionism, to modernism, through to paintings that I could only describe as thoroughly abstract (some of which I thoroughly liked!)

Extending my guide

I remember stopping at a couple of paintings at the impressionist section.  The disembodied voice of my guide was telling me to pay attention to the foreground, and the background: particular details were considered to be important.  I was given some background information, about where the painter was working and who he was working with.

On a couple of occasions I felt that I had been told a huge amount of detail, but I felt that none of it was sticking.  I didn't have a mental framework around which to store these new facts that I was being presented with.  Art history students, on the other hand, might have less trouble.

What I did discover is that some subjects interested me significantly more than others.  I wanted to know which artists were influenced by others.  I wanted to hear a timeline of how they were connected.

I didn't just want my guide to tell me about what I was looking at, I wanted my audio guide to be a guide, to be more like a person who would perhaps direct me to things that I might be interested in looking at or learning about.  I wanted my audio guide to branch off into an interesting anecdote about the connections between two different artists, about the trials and tribulations of their daily lives.  I felt that I needed this functionality not only to uncover more about what I was seeing, but also to help me to find a way to structure the information that I was hearing.

Alternative information

Perhaps my mobile device could present a list of topics or themes that relate to a particular painting.  It might display the name of the artist, some information about the scene that was being depicted, and perhaps some keywords that correspond to the type under which it could be broadly categorised.

Choosing these entries might direct you to related audio files or perhaps other paintings.  A visitor might be presented with words like, 'you might want to look at this painting by this artist', followed by some instructions about where to find the painting in the gallery (and its unique name or number).

If this alternative sounded interesting (but it wasn't your main interest) you might be able to store this potentially interesting diversion into a 'trail store', a form of bookmark for audio guides.
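
This is pure speculation on my part, but the 'trail store' could be little more than a small data structure held on the device.  A minimal sketch in Python, with names and identifiers invented purely for illustration:

# A hypothetical sketch of the 'trail store' idea: bookmarked diversions that a
# visitor can return to later. All names and identifiers are invented.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Diversion:
    exhibit_id: str        # the gallery's unique name or number for the exhibit
    note: str              # e.g. "you might want to look at this painting by this artist"
    audio_clip: str = ""   # optional identifier for a related audio file

@dataclass
class TrailStore:
    saved: List[Diversion] = field(default_factory=list)

    def bookmark(self, diversion: Diversion) -> None:
        # Park an interesting diversion without interrupting the current route.
        self.saved.append(diversion)

    def next_diversion(self) -> Optional[Diversion]:
        # Pop the most recently saved diversion when the visitor is ready for it.
        return self.saved.pop() if self.saved else None

store = TrailStore()
store.bookmark(Diversion("room-12/exhibit-034", "related painting by the same artist"))
print(store.next_diversion())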

Personalised guides

Of course, it would be much better if you had your own personal human guide, but there is always the fear of sounding like an idiot if you ask questions like, 'so, erm, what is impressionism exactly?', especially if you are amongst a large group of people!

There are other things you could do too.  Different visitors will take different routes through a gallery or museum.  You might be able to follow the routes (or footsteps) that other visitors have taken.

Strangers could name and store their own routes and 'interest maps'.  You could break off a route half way through a preexisting 'discovery path' and form your own.  This could become, in essence, a form of social software for gallery spaces.  A static guide might be able to present user-generated pathways through gallery-generated content.

Personal devices

One of the things I had to do when I explored my gallery was exchange my driving licence for a piece of clumsy, uncomfortable mobile technology.  It was only later that it struck me that I had a relatively high tech piece of mobile technology in my pocket: a mobile phone. 

To be fair, I do hold a bit of fondness for my simple retro Nokia device, but I could imagine a situation where audio guides are not delivered by custom pieces of hardware, but instead streamed directly to your own hand held personal device.  Payment for a 'guide' service could be made directly through the phone.  Different galleries or museums may begin to host their own systems, where physical 'guide access posters' give visitors instructions about how to access a parallel world of exploration and learning.

Rather than using something that is unfamiliar, you might be able to use your own headphones, and perhaps use your device to take away souvenirs (or information artefacts) that relate to particular exhibits.  Museums are, after all, so packed with information, it is difficult to 'take everything in'.  Your own device may be used to augment your experience, and remind you of what you found to be particularly interesting.

Pervasive guides

If each user has their own device, it is possible that this device could store a representation of their own interests or learning preferences.  Before stepping over the threshold of a museum, you might have already told your device that you are interested in looking at a particular period of painting.  A museum website might be able to offer you some advice about what kinds of preferences you might choose before your visit.

With the guide that I used, I moved between the individual exhibits by entering exhibit numbers into a keypad.  Might there be a better, less visible way to tell the guide device which exhibits are of interest?

In museums like the Victoria and Albert Museum and the Natural History Museum, it takes many visits to explore the galleries and exhibits.  Ideally a human guide would remember what you might have seen before and what interests you have.  Perhaps a digital personalised guide may be able to store information about your previous visits, helping you to remember what you previously studied.  A digital system might also have the power to describe what has changed in terms of exhibits if some time has elapsed between your visits.  A gallery may be able to advertise its own exhibits.
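
Again, purely to illustrate the idea (the exhibit identifiers below are invented), remembering previous visits and reporting what has changed could come down to little more than a set difference between what the gallery currently lists and what the visitor has already seen:

# A hypothetical sketch: remembering previous visits and reporting what is new.
# The exhibit identifiers are invented for illustration.
seen_on_previous_visits = {"impressionism-07", "modernism-12", "abstract-03"}
current_catalogue = {"impressionism-07", "modernism-12", "abstract-03",
                     "abstract-09", "photography-01"}

new_since_last_visit = current_catalogue - seen_on_previous_visits
no_longer_on_display = seen_on_previous_visits - current_catalogue

print("New exhibits to see:", sorted(new_since_last_visit))
print("No longer on display:", sorted(no_longer_on_display))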

Challenges

These thoughts spring from an idealised vision of what a perfect audio (or mobile) guide to a museum or gallery might look like.  Ideally it should run on your own device, and ideally it should enable you to learn and allow you to take snippets or fragments of your experience away with you.  In some senses, it might be possible to construct a museum exhibit e-portfolio (wikipedia), to store digital mementoes of your real-world experiences.

There are many unstated challenges in realising a pervasive, personalised mobile audio guide.  We need to understand how best to create material that works for different groups of learners.  In turn, we need to understand how best to create user models (wikipedia) of visitors.

Perhaps one of the biggest challenges may lie with the creation of a standards-based interoperable infrastructure that might enable public exhibition spaces to allow materials and services to be made available to personal hand held devices.

Acknowledgement: image from Flickr by jonmcalister, licensed under Creative Commons.
