I think I must have been about 10 years of age when I first heard about the magical device that was the Logo turtle (and the mysterious notion of turtle geometry). Fast forward two decades: about ten years ago I managed to find a copy of Papert's book Mindstorms in a second-hand bookshop in Brighton (a part of me thinks that it must have been owned by someone who studied cognitive science at the nearby University of Sussex). I had these two thoughts in mind whilst I was travelling to the University of Derby on 28 November to attend the HEA BotShop.
This is a quick summary of my own views of the day along with my own take on some of the different themes that emerged. In fact, there was quite a bit of commonality between this event and the HEA Open University event that was held a week earlier, but more of that later. In case you're interested, Derby produced a press release for this event which highlights some of the areas of interest: neural networks, embedded systems and artificial intelligence. We were told that this was pounced upon by the local press, proof that mention of robots has a constant and wide appeal.
Clive Rosen and Richard Hill, subject head for Computing and Mathematics at the University of Derby, jointly welcomed everyone to the workshop and introduced the day. The first presentation was by Scott Turner from the University of Northampton, entitled Neurones and Robots. Scott uses the Lego Mindstorms hardware to introduce some of the fundamental concepts of neural networks, such as how learning takes place through the changing of some of the numerical values that are used within a network. Some of the applications of these robots include following lines, getting robots to follow each other and getting robots to avoid obstacles: seemingly simple actions which enable underlying principles to be both studied and demonstrated.
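To make that "learning by changing numerical values" idea concrete, here is a minimal sketch of my own (not Scott's code): a single artificial neurone learning a turn-left/turn-right decision from two light sensors. The sensor readings and training data are invented purely for illustration.

```python
# Illustrative only: a single neurone (perceptron) for a line-follower style
# decision. The sensor values and training data below are made up; this is
# not the code from the presentation.

def predict(w, b, left, right):
    """Fire (turn right) when the weighted sensor sum exceeds the threshold."""
    return 1 if (w[0] * left + w[1] * right + b) > 0 else 0

def train(samples, epochs=20, rate=0.1):
    """Learning is just the nudging of numerical values inside the network."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (left, right), target in samples:
            error = target - predict(w, b, left, right)
            w[0] += rate * error * left    # strengthen or weaken each
            w[1] += rate * error * right   # sensor's influence
            b += rate * error
    return w, b

# turn right (1) when the left sensor reads brighter than the right
samples = [((0.9, 0.1), 1), ((0.8, 0.3), 1), ((0.2, 0.9), 0), ((0.1, 0.7), 0)]
w, b = train(samples)
```

The point of the sketch is how little is involved: the network "learns" simply because the weights drift whenever a prediction is wrong, which is exactly the sort of principle a line-following robot makes tangible.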
One of the questions to Scott was, 'what does using a robot add?' The answer was simple: it can add enjoyment and engagement. Although similar principles could be studied through the application of other tools, such as Excel spreadsheets, simple robots add a degree of immediacy and physicality that other approaches cannot. These discussions made me consider whether there is a pedagogic dimension (in terms of educational tools) that has emulation on one end and physical devices on the other, and the extent to which we may wish to consider a balance between the two.
Scott's presentation was followed by Martin Colley from the University of Essex whose presentation was entitled Embedded Systems Teaching. Having been a postgraduate student at Essex I found Martin's presentation especially interesting. Although I didn't have direct exposure to the robotics labs during my time at Essex I remember that some of my peers made good use of the research and teaching facilities that the University provided.
Martin gave a quick history of robotics and embedded systems at Essex, talking about the design and evolution of different devices which culminated in the development of a kit. The design of the kits had changed along with the development of processors: early kits (and robots) made use of the Motorola 68k processor (famed for its use in the original Apple Mac), whereas the current generation of kits make use of ARM processors which are, of course, commonplace in mobile phones and other embedded devices.
One aspect of Martin's kits that I really liked was the range of different input and output devices. You could write code to drive a 160x128 colour display, you could hook up a board that had light and ultrasound sensors, or you could even connect a memory card reader and a digital-to-analogue converter to enable students to write their own MP3 player. Martin also touched upon how the hardware might be used to explore some of the challenges of programming, including the use of C and C++, how to work with a real-time clock, and the use of interrupts and direct memory access buffers. Debugging was also touched upon, which, in my opinion, is a really important topic when 'students get down to the muddy layer between the hardware and software'. Plus, all these interesting peripherals are so much more fun than simply having an embedded system turn an LED on or off.
All in all, a really interesting presentation which gave way to a discussion about the broader challenge of teaching programming. One comment was that it isn't programming per se that is the main problem. Instead, it is developing the skills of algorithmic thinking, or knowing how to solve problems, that represents the biggest challenge.
Using bots to teach programming
The third presentation of the day was by Mark Anderson and Collette Gavan from Edge Hill University who described how the broad idea of robotics has been used to teach fundamentals of programming and how some students have learnt to build their own devices. Mark and Collette had a surprise in store. Rather than having to do most of the talking themselves they brought along a number of students who shared with us their own experiences. This was a great approach; personal accounts of experience and challenges are always useful to hear.
Mark and Collette's slot covered a broad range of technology in what was a short period of time. They began by describing how they made use of Lego Mindstorms kits (using the NXT brick) with Java in the first year of study. During the second year students move onto Arduino kits, where students negotiate the terms of their own projects. I seem to remember hearing that students are at liberty to create their own input controllers, which might be used in collaboration with an embedded arcade game, for instance. There was reference to a device called the Gameduino which allows the Arduino controller to be connected up to a video display. Not only can I see this as being fun, I can also see it as being pretty challenging too!
Towards the end of the session there was a question and answer session where another introductory programming tool called Alice was mentioned. There were two overriding themes that came from this session. The first was that learning to program (whether for an embedded device or a computer) isn't easy. The second is that it's possible to make that learning fun whilst developing essential non-technical skills such as team working.
One of the really interesting things about robotics is that a broad array of disciplines can come into play. One of the most important of these is engineering, particularly electronic and mechanical engineering. I guess there's a 'hard side' and a 'soft side' to robotics. By 'hard side' I mean using the concept of robots to teach about the processes inherent in their design, construction and operation. The 'soft side', on the other hand, is where robots can be used to teach problem solving skills and introduce the fundamentals of what is meant by programming. Tony Wilcox from Birmingham City University, who is certainly on the 'hard side' of this continuum, gave a fabulous presentation which certainly gave us software engineers (and I'm talking about myself here!) a lot to think about.
A micromouse is a small buggy (or autonomous robot) that can explore a controlled maze (in terms of its dimensions) and figure out how to get to its centre. It was interesting to hear that there is such a thing as a micromouse competition (I had heard of robot football, but not mice competitions before!) Different teams produce different mice, which then compete in a maze finding challenge. Tony uses his 'mice' to expose a number of different subjects to students.
Thinking about the problem for a moment, we begin to see a number of different problems that need to be addressed: how do we control the wheels and detect how far the mouse has travelled? What sensors might be used to allow the mouse to discover the junctions in a maze? What approaches might we use to physically design elements of the mouse? How might we devise a maze-solving algorithm? Tony mentioned a number of subject areas that can help to solve these problems: closed-loop process control (for the control of the wheels), mechanical engineering, 3D computer aided design, power electronics, and PIC programming (which is done using C). I'm sure there are others!
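On the maze-solving question, one classic micromouse strategy (not necessarily the one Tony's students use) is flood fill: label every cell with its distance from the goal, then repeatedly step to a reachable neighbour with a smaller label. Here is a sketch with a made-up 4x4 maze; the wall layout is invented for the example.

```python
# Flood fill for a toy micromouse maze. Cells are (x, y) pairs; 'walls' is a
# set of blocked edges, each stored as a frozenset of the two cells it joins.
from collections import deque

def flood_fill(walls, size, goal):
    """Breadth-first search outwards from the goal, labelling distances."""
    dist = {goal: 0}
    queue = deque([goal])
    while queue:
        x, y = queue.popleft()
        for n in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= n[0] < size and 0 <= n[1] < size and n not in dist
                    and frozenset({(x, y), n}) not in walls):
                dist[n] = dist[(x, y)] + 1
                queue.append(n)
    return dist

def route(dist, walls, start):
    """The mouse just follows strictly decreasing labels to the goal."""
    path, cell = [start], start
    while dist[cell] > 0:
        x, y = cell
        options = [n for n in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1))
                   if n in dist and frozenset({cell, n}) not in walls]
        cell = min(options, key=dist.get)
        path.append(cell)
    return path

walls = {frozenset({(0, 0), (1, 0)}), frozenset({(1, 1), (1, 2)})}
dist = flood_fill(walls, 4, goal=(3, 3))
path = route(dist, walls, (0, 0))
```

The appeal for teaching is that the same few lines touch graph search, data structures and the physical question of how the mouse detects the walls in the first place.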
It was interesting to hear Tony say something about programming libraries. Students are introduced to libraries in the second year of his teaching. Whilst libraries can help you to do cool stuff, you need to properly understand the fundamentals (such as bit shuffling and manipulation) to use them properly. And to best teach the fundamentals, you ideally need to do cool stuff!
One thing that I took away with me was that robot control and maze solving software can exist within 32K. I was reminded that even though there is a lot of commonality between programming on embedded devices and programming applications for a PC they can be almost different disciplines.
The micromouse robots operate within a controlled physical environment. Another way to make a controlled environment is to build one in software. The advantage of using software is that you can control everything: you can define your own laws of physics and control time in any way that you want. In a way, this makes things both easier and more difficult for us software engineers: we've got the flexibility, but we've got to figure out the boundaries of the environment in which we work all the time.
Dave Voorhis from the University of Derby talked about his work on 'emergent critters'. By critters, Dave means simple 'virtual animals' which hunt around an environment for food, bumping into others, whilst trying to get to an exit. There are some parallels with the earlier talks on robotics, since each critter has its own set of inputs and outputs. Students can write their own critters which can interact with others. There is a parallel with Core War, a historic idea where programmers write programs which fight against each other.
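To illustrate the inputs-and-outputs parallel (this is my own invented sketch, not Dave's actual framework), a critter can be thought of as nothing more than a function from local sense data to a single action, called by the simulation on every tick:

```python
# Purely illustrative: a hand-written rule-based critter. The sense keys and
# action names are made up for this sketch.

def simple_critter(senses):
    """Map what the critter can see locally onto one action for this tick."""
    if senses.get("food_adjacent"):
        return "eat"
    if senses.get("exit_visible"):
        return "move_towards_exit"
    if senses.get("blocked_ahead"):     # bumped into a wall or another critter
        return "turn"
    return "move_forward"               # otherwise keep exploring

# each tick, the environment builds the sense data and asks for an action
tick = {"food_adjacent": False, "exit_visible": True, "blocked_ahead": False}
action = simple_critter(tick)
```

The interesting (emergent) behaviour then comes not from any one critter but from many such functions interacting in the same world.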
Dave said that some students have written critters which 'hack' the environment, causing the critter world to exhibit unusual behaviour. Just as with the real world, working against some of the existing laws (and I'm talking those that are socially constructed rather than the other immutable variety) runs the risk of causing unintended consequences!
There were two presentations in the final session of the day. The first was by Steve Joiner from Coventry University about how robotics can be used to help teach mathematics to key stage 3 and 4 students. Steve is a maths graduate and is a part of the Centre of Excellence in Mathematics and Statistics Support (SIGMA), run in collaboration with Loughborough University. Steve showed us a number of videos where students had built robots out of the Lego Mindstorms NXT kit. Projects included navigation through sensors, launching projectiles (albeit small ones!) and predicting their behaviour, and applications of graph theory. Steve also demonstrated a self-balancing Lego robot (basically, a bit like a mini-Segway) that made use of ultrasonic sensors. Steve made the good point that through the use and application of kits and projects, students can begin to see why particular subjects within mathematics are useful.
The final presentation was an impromptu one by Jon Rosewell, from the C&S department at the Open University. Jon has been heavily involved with a short module called T184 Robotics and the meaning of life, which has just come to the end of its life. Like so many of the earlier presentations, this module makes use of Lego Mindstorms to introduce the concept of programming to students. Instead of using the native Lego software, the Open University developed its own programming environment (for the earlier Lego Mindstorms bricks) which is arguably easier to work with. Students are not given a robotics kit, but may use their own if they have one available; if they don't have access to a kit, they can make use of a robotics simulator which is a part of the same software package.
Towards the end of the day there was some talk about a new Open University undergraduate module called TU100 My Digital Life. This module makes use of a 'home experiment kit' which contains an Arduino processor, as mentioned by Mark Anderson earlier in the day. Whilst the TU100 SenseBoard, as it is called, cannot directly become a robot, it does contain some of the necessary features that enable students to understand some of the important ideas underpinning robotics, such as different types of input sensors, and output peripherals such as motors.
At the end of the day there was sufficient time for an open discussion about some of the themes that had emerged. Clive Rosen kicked off the discussions by saying that robots can help make teaching engaging. I completely agree! One of the difficulties of teaching subjects such as mathematics and computing is that it can often take a lot of work to get satisfying and tangible results. Using robots, in their various forms, allows learners to more readily engage with the subject that they are exploring. In my own eyes there are two key things that need to be addressed in parallel: firstly, what subjects the application of robots can help to explore, and secondly, how best to make use of them to deliver the most effective pedagogic experience.
These ruminations connect to a plenary discussion which related to the teaching of computing and ICT at school. There was a consensus that computing and ICT education needs to go much further than simply teaching students how to make use of Microsoft Office, for instance. We were all directed to a project called Computing at School, and a relatively new education (and hardware) project called Raspberry Pi was mentioned. I'm personally looking forward to seeing how this project develops (and hopefully to being able to mess around with one of these devices!)
There was some debate about individual institutions 'doing their own thing', in terms of building their own teaching hardware, raising the question of whether it might be possible to collaborate further (and the extent to which the higher education hardware might potentially be useful in the school environment). It was concluded that it isn't just a matter of technology; it may be more a matter of education policy.
In the same vein, it was hypothesised that the embedded processors within students' mobile telephones might (potentially) be used to explore robotics, perhaps by plugging a phone into an 'extension kit'. Another point was that little was discussed about fixed or industrial robots, which is almost a discipline in its own right, having its own tools, languages and technologies. This is a further example of how robotics can connect to other subjects such as manufacturing.
Thinking back to the event, there are a number of other themes that come to mind. Some of these include the role of simulation (and how this relates to physical hardware), the extent to which we either buy or build our hardware (which might depend on our pedagogic objectives), and the types of projects that we choose.
Through the use of robots students may more directly appreciate how software can change the world. Robotics is a subject that is thoroughly interdisciplinary. I've heard artificial intelligence (AI) described as applied philosophy. Not only can robotics be used to help to study AI, it also has the potential to expose learners to a range of different disciplines, such as mathematics, electronics, physics, engineering and the fundamentals of computing.
Learning how to program a robot is not just about programming. Instead, it is more about developing general problem-solving skills, developing creativity and becoming more aware of how to conduct effective research.