Computing and Communications: 2023 Research Fiesta
Tuesday, 31 Jan 2023, 08:42
On 25 January 2023 I attended an event called the School of Computing and Communications Research Fiesta, which took place on the university campus. One of my reasons for attending the fiesta was to try to restart my research activities, having stepped away from research for the last three years while taking on a role called ‘lead staff tutor’.

The last time I attended a school research fiesta was on 10 January 2019 (OU blog), which took place at the nearby Kents Hill conference centre. Following this earlier event, I shared an accompanying post about research funding (OU blog).
This event was advertised as a “… time for us to reconvene and discuss everything research. This event is aimed to help us (re-)connect with one another and understand how we can help and benefit from each other’s research expertise and outputs” and was facilitated by David Bush from Ascolto.
What follows is a summary of the Research Fiesta, in terms of what happened during the meeting and what I felt the biggest takeaway points were. This blog may be of interest to anyone who was at the event, anyone who couldn’t make it, or anyone broadly interested in the process of research (whether computing research, or research that takes place within other disciplines).
Preparation
Before the event, we were asked to prepare some cards which summarised our research interests. Although I didn’t write the cards in advance, I did come to the event with some ideas in mind. Here’s what I wrote down on three cards:
Understanding and characterising green computing: what it is, what the boundaries and problem are, and how can we embed this theme into our teaching?
Storytelling, soft skills, and software engineering: what role does storytelling play or could play in software engineering practice, and how might storytelling be used to develop soft skills in the next generation of computing graduates?
Accessibility of web technologies: how accessible are the current generation of web-based applications, and to what extent are hybrid apps accessible with assistive technology? How useful is WAI-ARIA? Is it still useful? Does it have an impact?
Later during the session, I added two more cards:
Pedagogy of teaching programming at a distance: innovative tutorials; how to develop tutors, and how to help them to be creative, perhaps by embedding and using drama.
Development of writing skills across the computing curriculum.
This final idea emerged from discussions with tutors and might form the basis of a scholarship project. The university has prepared a lot of materials about writing; the question is whether the computing programme makes effective use of them, given the writing requirements of some courses.
Activity 1: Sharing research ideas
Our first activity could be loosely called “academic speed dating”.
I’ve done this before (both the academic version, and the non-academic version).
In this version, we were sent to various tables, where we met up with two other colleagues. Our task was to show our cards (our research ideas) and try to create a new card that combined aspects of all of our cards. When we had done this, we had to pin our cards onto the wall to share our ideas with everyone.
Activity 2: Forming research teams
After a short break, everyone was asked to form a line based on how much research experience they had. On one side, there were all the new PhD students, and on the other side, there were the professors and heads of existing research groups.
Approximately 6 PhD students and early researchers were asked to review the cards that had been generated from the speed dating activity, and each had to choose a card they found most interesting. This card (represented by one of the researchers) would then form the basis of a new team of 3 or 4 researchers.
One at a time, the rest of the researchers were ushered over to speak with the new researchers. If you liked an idea, and a team didn’t already have 3 or 4 researchers, you could join it. The longer the game went on, the harder it became for the more experienced researchers: they had to use all their powers of persuasion either to join an existing team or to persuade fellow researchers to create new teams.
After some discussion and reviewing of cards, I joined two of my colleagues, Dhouha Kbaier and Yaw Buadu. Two project cards were combined to create a new project. Paraphrasing our cards, our project intended to:
Develop digital technologies to enhance engagement and participation by integrating more physical computing into the computing curriculum.
Accompanying research questions were: what are the challenges of using physical computing in a distance learning environment, and how might physical computing devices be connected to and integrated within the Open STEM labs?
This final question suggests the opportunity to explore costs and trade-offs of a physical computing approach where students use their own equipment, or share equipment with other students through a platform which is accessed remotely.
What might physical computing actually mean? One answer to this is: physical hardware used by students to learn about or to solve computing problems, as opposed to using software simulations. There is a precedent of using (and sharing) physical computing devices at the university. In earlier decades, there was the Hektor computer (computinghistory.org.uk), which was once sent out to computing students (and then later returned to the university).
A more modern and smaller (and much more sophisticated) equivalent is the Raspberry Pi computer (Raspberry Pi website), which can be used for any number of interesting computing projects.
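To make the idea of physical computing a little more concrete, here is a minimal sketch of the kind of first exercise a student might try on a Raspberry Pi. It is my own illustration rather than anything produced at the fiesta, and it assumes an LED wired to GPIO pin 17 and the gpiozero library:

```python
# Illustrative sketch only: blink an LED attached to a Raspberry Pi.
# Assumes the gpiozero library is installed and an LED is wired to
# GPIO pin 17 (the pin choice is an assumption).
from gpiozero import LED
from time import sleep

led = LED(17)

# Blink the LED ten times. Unlike a software simulation, the feedback
# here is physical and immediately visible to the student.
for _ in range(10):
    led.on()
    sleep(0.5)
    led.off()
    sleep(0.5)
```

Part of the appeal of this kind of exercise is that a few lines of code produce a visible change in the physical world, which speaks to the engagement and participation our project card was aiming for.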
One other aspect that we discussed was the stakeholders: who might need to be involved? We identified the following groups: students, tutors, module team members, and administrative university functions. (A module team may include both tutors and curriculum managers, who act as a fundamental link between the academic team and the operations of the university bureaucracy.)
Impact: evaluation and presentation
The next part of the fiesta was a presentation: a double act by two colleagues from the research school, Betul Khalil, an Impact Manager, and Gareth Davies, a Research Evidence Impact Manager.
They began with a question: what is impact, and can we give an example?
Impact isn’t the same as project outcomes; they are very different things. An outcome might be a report, or some software. An impact refers to a change that leads to a positive long-term benefit for stakeholders. In terms of the UK Research Excellence Framework (REF Impact case studies), impact could mean a change to society, the economy, or the natural environment. A measurable change might be on a local, regional or international scale.
The message to us was clear: when working on a project bid, researchers need to proactively consider impact from the outset and define impact objectives, since gathering effective evidence to show how those objectives may have been met takes time. In some respects, impact evidence gathering is a further part of the research process. To do it well, researchers need an impact plan to accompany a research or project plan.
We were all given a handout, from which I have noted down some useful questions that researchers need to bear in mind. These are:
Who are the stakeholders, and who might be affected by the change your project may facilitate?
What do the stakeholders (or beneficiaries) gain from your research?
Why will they engage with your research?
How will you communicate with beneficiaries?
What activities might you need to run to effect change?
How might you evidence change?
How will you connect change to your research?
Later, Gareth talked more about what it means to ‘evidence’ impact. An important note I made from Gareth’s presentation was that “upstream planning is important” and that the analysis of impact should be rigorous. Researchers also need to consider which methods will enable them to observe what is changing.
Apparently, one of the most common forms of evidence is a written testimonial (in the form of a testimonial letter). Within this assertion lies the reflection that researchers need to make sure they have the time and the means to gather evidence.
Activity 3: How will we do our project?
Our next activity was to sit around a table and figure out how we were going to answer our research questions.
We began by asking: what might the outputs from our project be? We came up with some rough answers, which were:
Guidelines about how physical computing could be embedded and used within module teams. If used within a module, tutors could then be offered some accompanying guidance.
Recommendations about physical kit that could be used (these kits might be bought, or borrowed, or used from a distance); recommendations about the use of software; recommendations about pedagogy and use (which is an idea that can relate to the idea of useful guidelines).
To produce these, what needs to be done? Our team offered the following suggestions (but the exact order of carrying these out could be easily debated):
Examine learning outcomes within various qualifications and accompanying modules.
Explore the problem space by running focus groups with stakeholders to understand how the terms engagement and participation are understood.
Use mixed methods: from the focus group results, carry out a survey to more thoroughly understand how a wider population understands engagement and participation.
From these different information sources (and input from the learning outcomes), facilitate a number of curriculum design workshops to understand how physical computing can be brought into the curriculum.
Carry out a detailed analysis of all the data that has been captured, writing up all the findings (a small sketch of what one part of this step might look like follows this list).
Implement the findings.
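To give a flavour of what the quantitative side of the analysis step might involve, here is a minimal sketch, assuming a hypothetical CSV export of survey responses (survey_responses.csv) with two Likert-scale columns named engagement and participation, scored 1 to 5. None of these names come from the project itself; they are illustrative assumptions.

```python
# Illustrative sketch only: summarise hypothetical Likert-scale survey
# responses. Assumes a CSV file called survey_responses.csv with
# columns "engagement" and "participation" (values 1-5); the file name
# and column names are assumptions, not project artefacts.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")

# Descriptive statistics for each question.
print(responses[["engagement", "participation"]].describe())

# Frequency counts are often more informative than means for
# ordinal Likert data.
for column in ["engagement", "participation"]:
    print(f"\n{column} response counts:")
    print(responses[column].value_counts().sort_index())
```

The focus group and workshop data would, of course, need a different, qualitative treatment.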
A further reflection was that each of these activities needs to be considered in terms of SMART objectives: specific, measurable, achievable, realistic and time-bound.
A new question that we were asked was: what impact will your project have?
Given that students are key stakeholders, there might be broader impacts in terms of results in the National Student Survey. There might be further impacts both within the university and for other organisations that provide distance learning. There might also be impacts that could be broadly described as the further development of computing pedagogy. This is all very well, but how might we go about measuring all this? It is this question which the facilitators from the research school may have wanted to encourage us to consider.
What happens next?
After presenting our plan to all the other groups, we were asked a couple of final questions, which were: how excited are you about the project? Also, how doable (or realistic) is the project?
Given that we all have our own main research interests (which are slightly different to the new project that we have defined), we all had different levels of enthusiasm about going ahead with this project idea. That said, the key concepts of physical computing (in its broadest sense) and student engagement are important topics which other researchers may well be interested in exploring. Even if this particular team may not be in a position to take these ideas forward, the ideas are still worth exploring and studying.
Reflections
I really liked the way that we were asked to focus on trying to get things done.
When thinking about research (and research projects), impact has always been at the back of my mind, but I have tended to consider it as something quite intangible and difficult to measure. The presenters from the research school made a really clear point: it is important to plan for impact before your project has started.
A personal reflection is that impact could be thought of as a way to reflect on the success of a project. In some respects, this is something that researchers should be doing as a matter of course to further develop their professional skills. Of course, the extent and nature of this analysis will depend very much on the nature of the research that is carried out through a project. Given the collaborative nature of research, the gathering of impact evidence is likely to be collaborative too.
It is interesting to compare this Research Fiesta with the one that was held in 2019. One of the differences was that there were a lot fewer people attending this event. This might have been due to the timing (some new module presentations were just about to begin) or a hangover from the 2019-20 pandemic (during which so many colleagues switched to homeworking).
An interesting difference related to the structure: this event was facilitated in a dynamic way, where the research themes emerged from the participants. The earlier event had more emphasis on sharing information about the research groups within the school, and on the practicalities of how to gain funding for research. There is, of course, no right or wrong way to run a research fiesta. I appreciated the dynamic structure, but equally I’m always up for hearing about new concepts and ideas, and learning about what is happening within and across the school.
Acknowledgements
Many thanks to Amel Bennaceur for organising the event. One of the impacts has been the chance to catch up with colleagues and to learn more about them! It was a pleasure working with my fellow group members, Dhouha Kbaier and Yaw Buadu, who kindly reviewed this blog article before it was published.