
Enhancing Employability of Computing Students

Edited by Christopher Douce, Monday, 3 Mar 2014, 18:49

I was recently able to attend the first Higher Education Academy (HEA) event that explicitly aimed to discuss how universities might enhance the employability of computing students.  The intention of this blog post is to present a brief summary of the event (HEA website) and to highlight some of the themes (and issues) that I took away from it.

The day was held at the University of Derby Enterprise Centre and was organised on behalf of the HEA Information and Computer Sciences subject group.  I had only ever been to one HEA event before, so I wasn't quite sure what to expect.  This said, the title of the workshop (or mini-conference) really interested me, especially after having returned to the higher education sector from industry.

The day was divided into two sets of paper presentations punctuated by two keynote speeches.  The afternoon paper session was separated into two streams: a placements workshop and a computing forensics stream.  Feeling that the placements workshop wasn't really appropriate for me, I decided to sit in on the computing and forensics stream.

Opening Keynote

The opening address was given by Debbie Law, an account management director at Hewlett Packard.  As well as outlining the HP recruitment process (which sounds pretty tough!), Debbie mentioned that, through various acquisitions, there had been a gradual movement beyond technology (such as PCs and servers) towards the provision of services.  Businesses, it was argued, don't particularly care about IT itself, but they do care about what IT gives them.

So, what makes an employable graduate?  They should be able to do a lot!  They should be able to learn and to apply knowledge (completing a degree should go some way to demonstrating this).  Candidates should demonstrate their willingness to consider (and understand) customer requirements.  They should also demonstrate problem solving and analytical skills and be able to show a good awareness of the organisations in which they work.  They should be performance driven, show good attention to detail (a necessity if you have ever written a computer program!), be able to lead a team and be committed to continuous improvement and developing personal effectiveness. Phew!

I learnt something during this session (something that perhaps I should have already known about).  I was introduced to something called ITIL (Information Technology Infrastructure Library) (Wikipedia).  ITIL was later spoken about in the same breath as PRINCE (something I had heard about after taking M865, the Open University project management course).

First paper session

There were a few changes to the published programme.  The first paper, by McCrae and McKinnon, was entitled Preparing students for employment through embedding work-related learning.  It was at this point that the notion of employability was defined as: 'a set of attributes, skills and knowledge that all labour market participants should possess to ensure they have the capability of being effective in the workplace - to the benefit of themselves, their employer and the wider economy'.  A useful reference is the Confederation of British Industry's Fit for the Future: preparing graduates for the world of work report (CBI, 2009).

The presentation went on to explore how employability skills (such as team working, business skills and communication skills) may be embedded within the curriculum using an approach called Work Related Learning (WRL).  The underpinning ideas relate to linking theory and practice, using relevant learning outcomes, widening horizons, carrying out active learning and taking account of cultural diversity.  A mixed methodology was used to determine the effectiveness of embedding WRL within a course.

The second paper, by Jing and Chalk, was entitled An initiative for developing student employability through student enterprise workshops.  The paper outlined one approach to bridging the gap between university education and industry through a series of seminars, delivered over a twelve-week period by people who currently work within industry.  The problem described was one of low employment rates amongst computing graduates (despite alleged skills shortages), low enrolment on work placement ('sandwich') years, and a lack of employability awareness (which also includes job application and interview skills).

The third presentation was by our very own Kevin Streater and Simon Rae from the Open University Business School.  Their paper was entitled 'Developing professionalism in New IT Graduates? Who Needs It?'  Their paper addressed what it may mean to be an IT professional, encouraging us to look at the British Computer Society Chartered IT Professional status (CITP) (in addition to ITIL and PRINCE), and something called the Professional Maturity Model (which I had never heard of before).

Something else that I had never heard of before is the Skills Framework for the Information Age (SFIA).  By using this framework it is possible to identify whether new subjects or modules might contribute to enhancing the degrees of undergraduates who are studying with a view to working within a particular profession.  Two Open University courses were mentioned: T122 Career Development and Employability, and T227 Change, Strategy and Projects at Work.

This final presentation of the morning was interesting since it asked us to question the notion of professionalism, and presented the viewpoint that the IT profession has a long way to go before it could be considered akin to some of the other more established professions (such as law, engineering and accountancy).

During the morning presentations I also remember a reference to E-Skills, which is the Sector Skills Council for Business and Information Technology, a government organisation that aims to help to ensure that the UK has the IT skills it needs.

Computing and Forensics Stream

This stream especially piqued my interest since I had studied a postgraduate computing forensics course, M886, through the Open University a couple of years earlier.

The first paper was entitled Teaching Legal and Courtroom Issues in Digital Forensics by Anderson, Esen and Conniss.  Like so many subjects, digital forensics requires both academic and professional skills to be applied and considered.  Academic education considers the communication of theories and dissemination of knowledge, and learning how to think about problems in a critical way by analysing and evaluating different types and sources of information.

The second paper was about syllabus development, with an emphasis on the practical aspects of digital investigation, and was presented by Sukhvinder Hara, who drew upon her extensive experience of working as a forensic investigator.

The third paper was about how a virtualised forensics lab might be established through the application of cloud computing.  I found this presentation interesting for two reasons: firstly, due to the interesting application of virtualisation, and secondly, due to a resonance with how parts of the T216 Cisco networking course are taught, where students are able to gain access to physical hardware located within a laboratory just by 'logging on' from their personal computer or laptop.

The final paper of the day was an enthusiastic presentation by David Chadwick who shared with us his approach of using problem-based learning and how it could be applied to computing forensics.

This final session of the day brought two questions to my mind.  The first related to the relationship between teaching the principles of computing forensics and the challenge of producing graduates who know the tools that are used within industry.  The second related to the more general question: 'so, how many computing forensics jobs are there?'

It struck me that a number of the forensics courses around the UK demonstrate the use of similar technologies.  I've heard two products mentioned on a number of occasions: EnCase (Wikipedia) and FTK (Wikipedia), both of which are featured within the Open University M889 course.  If industry requires trained users of these tools, is it the remit of universities to offer explicit 'training' in commercial products such as EnCase?  Interestingly, the University of Greenwich, like the Open University (in the T216 course), enables students to study for industrial certification whilst at the same time acquiring credit points that can count towards a degree.

So, are there enough forensics jobs for forensics graduates?  You might ask a very similar question which also begs an answer: are there enough psychology jobs for the number of psychology graduates?  I've heard it said that studying psychology introduces students to the notion of evidence, different research methodologies and research designs.  It is a demanding subject that requires you to write in a very clear way.  Studying psychology teaches and develops advanced numeracy and literacy as much as it introduces scientific method and the often confusing and complex nature of academic debate.

Returning to computing forensics, I sensed that there might not be as many jobs in the field as there are graduates, but it very much depends on what kind of job you might be thinking of.  Those graduates who took digital forensics courses might find themselves working as IT managers, network infrastructure designers or software developers, as opposed to working purely within law enforcement.  Understanding the notion of digital evidence and knowing how to capture it is an incredibly important skill, irrespective of whether or not a student becomes a fully fledged digital investigator.

Concluding Discussions

One of the best parts of the day was the discussion section.  A number of tensions became apparent.  One of the tensions relates to what a university should be and the role it should play within wider society.  Another tension is the differences that exist between the notions of training and education (and the role that universities play to support these two different aims).

Each organisation and area of industry will have a unique set of training and educational requirements.  There are, of course, more organisations than there are universities.  A particular industry may have a very specific training problem that necessitates the development of educational materials that are particular to its own context.  Universities, it can be argued, can only go so far in meeting very particular needs.

A related question is, of course, the difference between training and education.  When I worked in industry there were some problems that could only be solved by gaining task-specific skills.  Within the field of software development this may be learning how to use a certain compiler or software tool set.  Learning a very particular skill (whilst building upon existing knowledge) can be viewed as training.  An engineer can either sit with a user manual and a set of notes and figure things out over a period of a month or two, or alternatively go on an accelerated training course and learn what to do in a matter of days.

Education, of course, goes much deeper.  Education is not just about knowing how to use a particular set of tools; it is about knowing how to think about your tools (and their limits) and understanding how they may fit within the 'big scheme of things'.  Education is also about learning a vocabulary that enables you to begin to understand how to communicate with others who work within your discipline (so you can talk about your tools).

Within the ICT sector the pace of change continues to astonish me.  There was a time when universities in conjunction with research organisations led the development of computing and computer science.  Meanwhile, industry has voraciously adopted ICT in such a way that it pretty much pervades all our lives.

So, where does this leave degree level education when 'general' industry may be asking for effective IT professionals?  It would be naive to believe that the university sector can fully satisfy the needs of industry, since the nature of industry is so diverse.  Instead, we may need to consider how to offer education and learning (which the university sector is good at) in a way that leads towards the efficient consumption of training (which satisfies the needs of industry).  This argument implies that the university sector is for the 'common good' as opposed to being a mechanism that allows individuals to gain specialist, topic-specific knowledge that can immediately lead to a lucrative career.  Becoming an ICT professional requires an ability to continually learn due to perpetual innovation.  A university level education can provide a fabulous basis from which to gain an introduction to this rapidly changing world.


Comments

Useless IT courses in Unis

Hi, I am Roman, I do art. I never buy HP printers; they are too expensive to run. It seems HP does not know that. I worked in IT for 10 years, then quit and went into art full time (I am not a pensioner).


In my experience, learning IT at university is a waste of time. I did a full degree and still had to learn everything from scratch on the job. The unis are too far behind in their curricula to teach anything useful. An art degree is just as good, followed by a two-week course in the specific IT area a business needs, and then your colleagues teach you the rest. HP's recruiting requirements are a laugh, for what? All you need is a good person with an open mind.

My advice: drop IT courses which teach nothing useful, go instead towards something which teaches you to think, do a two-week course in IT and voila. Roman