Fig. 1: New Scientist, 9 February 2013, 'Mind Maths' by Colin Barras
And it is visualised in many ways: Engeström (2007) thinks in terms of fungi, with his metaphor of mycorrhizae.
My own take is a lichen:
The language you use carries with it connotations and hidden assumptions. You need to make things as clear and as explicit as possible to develop shared meaning and understanding and to avoid confusion. (Conole, 2011:404) Indeed. Conole manages several metaphors in a single sentence:
· Different lenses
· Digital landscape
· Navigate through this space
So we've got camera lenses and how the eye sees; we have a landscape that has a physical presence, where a digital one does not; and then we have an image of a tall ship on an ocean passing through this landscape (or at least I do). You might see a GPS device, or a map and compass on the Yorkshire Fells. Language creates images in our mind's eye. The danger of a metaphor is when it creates parameters or absolutes.
I find it problematic that despite the tools around us we are obliged to communicate with words. We could use images, we could use live audio, but we are yet to construct and respond to these activities with a piece to webcam.
Conole and Oliver mention four levels of description:
1. Flat vocabulary
2. More complex vocabulary
3. Classification schemas or models
Which is the most persuasive? The most effective and memorable?
This set of words is used to describe Cloudworks. Only the last stands out as pertinent to Web 2.0 and the kinds of terms apt for e-learning in 2011.
Metaphors are indeed 'powerful ways of meaning making'. (Conole, 2011:406)
Ref: Lakoff, G. and Johnson, M. (1980) Metaphors We Live By.
Over the last 18 months I have returned repeatedly to the importance and value of metaphors, drawing on neuroscience and literature. There are 28 entries in which metaphor is discussed. This is perhaps the most insightful as it draws on an article in the New Scientist.
Morgan’s Metaphors discussed by Conole, White and Oliver (2007)
5. Political systems
Whatever works for you; but importantly, use what can be comprehended by others.
Presenting on social media over the last few weeks I have repeatedly used images of the Solar System to develop ideas of gravity and magnitude, spheres of influence and impacts. It is one way to try to make sense of it. The other one I use is the water cycle, though that can turn into an A-level geography class.
Some further thoughts from Conole
‘These and other tools are beginning to enable us to embed more meaning in the objects and connections within the digital space. The tools can also be used to navigate through the digital space, providing particular narrative paths of meaning to address different goals or interests.’ (Conole, 2011:409)
‘The approach needs to shift to harnessing the networked aspects of new technologies, so that individuals foster their own set of meaningful connections to support their practice, whether this be teachers seeking connections to support them in developing and delivering their teaching, or learners in search of connections to support and evidence their learning.’ (Conole, 2011:410)
‘Those not engaging with technologies or without access are getting left further and further behind. We need to be mindful that the egalitarian, liberal view of new technologies is a myth; power and dynamics remain, niches develop and evolve. Applications of metaphorical notions of ecology, culture and politics can help us better understand and deal with these complexities.’ (Conole, 2011:410)
How do we describe and make sense of digital environments?
It is complex and multifaceted
This may be a good or bad thing, depending on what is keeping you busy.
I need to find the perceptive Stephen Appleby cartoon that expresses this very well. It shows a guy riding an escalator which represents life and getting older; this character moans about life going too fast and immediately the escalator turns into a ramp. Of course, faced with this greater struggle, our character bemoans his lot even more vociferously.
(I liked it so much I cut it out and put it in a portfolio - the physical kind. Today I would photograph and upload it ... I'd digitise it.)
I think this New Scientist article is saying take up Kite-surfing or rock-climbing.
Or gymnastics for the mind.
After one year of doing an MA course with the OU, I feel as if I have experienced three.
'People with busy lives don't necessarily live longer, but they might feel as if they do.'
All this is from a New Scientist news story 29 Jan 2011.
So how does someone gaoled for 25 years feel?
'Our brains use the world around us to keep track of time, and the more there is going on, the slower time feels.'
I'd hardly say I felt that time was grinding to a halt, but this last 12 months, with the OU MAODE in the vanguard, I've packed in a good deal. It's starting to feel like 'Groundhog Day' at the point where Bill Murray (Phil Connors) has gone positive.
People with busy lives are happier, so long as the degree of busyness is something that they control.
“If you want to get something done, give it to a busy person.”
Find out more in 'Current Biology'
I relish phrases such as 'adaptive use of stochastically evolving dynamic stimuli' and 'a process of Bayesian inference based on expectations of change in the natural environment.' These phrases are food for the brain, like eating gizzards for the first time, as I recall doing at the Auberge Les Allouettes aged 15. I had this habit of always trying something I had not tried before; gizzards I never attempted again. I did take to steak tartare.
Neuroscience interests me; my steak tartare.
Which is another way, I am sure, to stretch time - keep a diary, blog, do an OU course, and so live in the past, present and the future.
So I blogged three months ago when considering the merits and demerits of keeping a learning journal and reflective writing.
It transpires that sleep really does sort the 'memory wheat from the chaff', according to a report in the Journal of Neuroscience (DOI: 10.1523/JNEUROSCI.3575-10.2011) referred to in the current New Scientist, This Week, 5 February 2011.
‘It turns out that during sleep the brain specifically preserves nuggets of thought it previously tagged as important,’ Ferris Jabr says.
I have always used sleep to reflect on ideas.
If I expect or wish to actively dwell on something I will go to sleep with the final thought on my mind, a pen and pad of paper by my side. Cat naps are good for this too. I will position myself with pillows and a book, or article and drift off as I finish. Waking up ten or twenty minutes later I glance straight back at the page and will feel a greater connection with it.
I wonder if there is commercial value in working from home and doing so up 'til the point you need to fall asleep? It's how my wife works when she is compiling a hefty report. It's how I work when I have an assignment, or a script to deliver ... or a production to complete. The work never stops and it doesn't stop me sleeping.
Going back to tagging.
How does the mind do this? In curious ways. We all know how a memory can be tagged with a smell or a sound. For me, mothballs remind me of my Granny’s cupboard (an image of it immediately in my mind). A Kenwood blender will always remind me of my mother grinding biscuits to put on the base of a cheesecake. And a sherbet dip, the Caravan Shop, Beadnell, Northumberland. Often when a random recollection enters my consciousness I try to think what has triggered it: the way the light falls on a tree, the exhaust from a car, or even a slight discomfort in my stomach. It is random. Indeed, is a random thought not impossible?
There has to be a trigger, surely?
Can any of these be used?
Perhaps I could categorise content here, or in an eportfolio by taste. So chocolate digestive biscuits might be used to recall anecdotes. Toothpaste might be used to recall statistics. Varieties of Bassett’s Liquorice Allsorts might be associated with people I have got to know (a bit) during the MAODE.
The mind boggles; or at least mine does.
Colour and images (still or moving) are as much as we can do so far.
I’m intrigued by memory games. I like the journey around a familiar setting where you place objects you need to remember in familiar places so that you can recall a list of things. Here the tag is somewhere familiar juxtaposed with the fresh information.
Are there better ways to tag?
Look at my ridiculously long list of tags here. Am I being obtuse? When I think of a tag do I come up with a word I've not yet used? How conducive is that to recalling this entry, or grouping similar entries to do the job?
I like the way some blogs (WordPress, EduBlogs) prompt you to use a tag you’ve applied before; it offers some order to it all. I long ago lost track of the 17,000 entries in my blog. Would I want to categorise them all anyhow? I think I managed 37. I prefer the 'Enter@Random' button I installed.
Going back to this idea of tagging by taste/smell, might a word (the category) be given division by taste/smell, texture and colour? How, though, would such categories work in a digital form? Is all I'm doing here recreating a person’s shed, stuff shoved under the bed or stacked in a garage, or put in a trunk or tuck box in the attic?
In the test reported in the Journal of Neuroscience, those who went to bed in the knowledge that they would be tested on the information they had looked at that day had 12% better recall.
It doesn’t happen in the MAODE, if at all. When are we put on the spot? When are we ever expected to play back a definition under ‘duress’?
‘There is an active memory process during sleep that selects certain memories and puts them in long-term storage.’
Like an e-portfolio?
Is the amount of sleep I've had, the 350 or so nights since I started the MAODE ... part of the learning environment required?
Wilhelm et al. (2011) 'Sleep Selectively Enhances Memory Expected to Be of Future Relevance', J. Neurosci. 31: 1563–1569.
I'm not tired, which is the worry; it'll catch up with me. When I wake up with a clear, original thought I've learnt to run with it. Time was I could have put on a light, scribbled a bit, then drifted off again. After 17 years of marriage (and 20 years together) I've learnt to get up. And once I'm up, I know it'll be a while before I can sleep again.
(I'll sleep on the train into London; at least I can't overshoot. I once got on the train at Oxford on the way into town and woke up in Cardiff).
I have the thought nailed, or rather sketched out, literally, with a Faber-Castell Artist Pen onto an A5 sheet of cartridge paper in a Derwent hardback sketchbook. This seems like a waste of good paper (and a good pen), but this doodle, more of a diagram, almost says it all. My vision, my argument, my persuasive thought. My revolution?
Almost enough, because I then show how I'll animate my expression of this idea by drawing it out in a storyboard. I can do it in seven images (I thought it would take more). I hear myself presenting this without needing to do so; though, believing myself quite capable of forgetting this entire episode, I'll write it out too.
I once thought of myself as an innovator, even an entrepreneur. I had some modest success too; enough to think such ideas could make me. I realise at this moment that such ideas are the product of intense mental stimulation. To say that H808 has been stimulating would be to undervalue how it has tickled my synapses. The last time I felt I didn't need to sleep I was an undergraduate; I won't make that mistake again. Our bodies have needs. So, to write, then to bed.
(This undergraduate thing though, or graduate as I now am ... however mature. There has to be something about the culture and context of studying that tips certain people into this mode).
You may get the full, animated, voice over podcast of the thing later in the week. I'll create the animation myself using a magic drawing tool called ArtPad and do so using a stylus onto a Wacom board.
(Never before, using a plastic stylus on a plastic ice-rink of a tablet, have I had the sensation that I am using a drawing or painting tool with real ink or paint. I can't wait 'til I can afford an A3-sized Wacom board ... drawing comes from the shoulder, not the wrist, and certainly not the fingertips. You need scale. Which reminds me, where is the book I have on Quentin Blake?)
Now where's a venture capitalist when you need one at 04.07am? That and a plumber: the contents of the upstairs bathroom (loo, bath and sink) are flooding out underneath the downstairs loo. Pleasant. A venture capitalist who is a plumber. Now there's something I doubt can be found even if you search in Ga-Ga Googleland.
Not even a Twit.
I'd like to blog extensively about Quantum Evolution. Perhaps once the H808 ECA is done.
For now, read more in the New Scientist, 8 January 2011.
Wherein lies the advantage of Twitter: a little, and often, to build readership.
1. Are we hard-wired to how we conceptualise ideas?
2. Does this help or hinder the way we use eLearning tools?
3. Will children, say, 50 years from now, look at paper and pen in the same way as a person does now when they take a first look at computer?
4. Are we at some 'transition' point, and if we are, what does this mean?
My tutor in H808 asked me this on 12th September.
I feel far better able to reply now after four months of H808 and some fortuitous reading, though I did respond at the time. My forum thread exchange then and reflection on it today will form part of my ECA.
It surprises me that I have subscribed to a magazine at all, but I find the New Scientist offers plenty on our e-world upon which to reflect and insights to all kinds of other things that tickle my brain.
It matters that you read broadly.
The French film director François Truffaut was a firm believer in reading everything and anything that caught your attention. He’d have loved the web. It matters that you follow what the web offers, then browse the shelves for magazines at the newsagent on the forecourt of your station.
My favourite button that has been crucial to the longevity of my blog (elsewhere) for the last seven years is ‘Enter@Random.’
We don’t think in chronological order.
Thinking is a mess; it selects ideas and makes things up using different sides and corners and nooks and crannies of our brains. I unplugged the calendar on my diary in year one and replaced it with 12 themes that have now grown to 37. For a period there were 37 blogs, but try managing that; to say you end up with a split personality is an understatement.
My tutor put it to me (and us) the H808 Tutor Group:
1. Are we hard-wired to how we conceptualise ideas?
Dr Vilayanur S Ramachandran thinks so. We have a unique capacity to think in metaphors. This matters. It is this ability that makes us creative, allows us to be inventive, it is what makes us human beings.
Read all about it in the New Scientist.
Quoted here within the 200-word limit permitted for a student quote.
Added for student reading in a non-commercial academic context, having read the copyright permissions.
Ramachandran is particularly interested in metaphor because it ties in neatly with his previous work on synaesthesia - a kind of sensory hijack, where, for example, people see numbers as colours or taste words. "Metaphor is our ability to link seemingly unrelated ideas, just like synaesthesia links the senses," he says.
After spending years working with people who have synaesthesia, he believes "pruning genes" are responsible. In the fetal brain, all parts of the brain are interconnected, but as we age, the connections are pruned. If the pruning genes get it wrong, the connections are off. "If you think of ideas as being enshrined in neural populations in the brain, if you get greater cross-connectivity you're going to create a propensity towards metaphorical thinking," he says.
I don't have synaesthesia, neither does Ramachandran, but he points out to me the strangeness of asking why, say, the cheddar cheese in your sandwich is "sharp". It's true, cheese isn't sharp, it's soft, so why do I use a tactile adjective to describe a gustatory sensation? "It means our brains are already replete with synaesthetic metaphors," he says. "Your loud shirt isn't making any noise, it's because the same genes that can predispose you to synaesthesia also predispose you to make links between seemingly unrelated ideas, which is the basis of creativity."
Of the 12 photographs in this issue as many as 8, I think, are from the Getty image bank. I wonder if one day, especially if I’m reading this on an iPad, the images will move, rather as the paintings are alive in the background of a Harry Potter movie. It wouldn’t take much for a photographer to shoot video as well as, or instead of, taking a photograph. Indeed, the BBC now permits directors to generate HD TV footage using digital SLR cameras … the lenses are better, the creative choices wider.
2. Does this help or hinder the way we use eLearning tools?
How we use the web, let alone e-learning tools, is in its infancy. We are still putting old ways online, still making web pages into slide shows and calling them immersive learning. Gaming may change this, given the budget. Better, faster tools will enable more. Collaboration on worldwide wikis, with like minds and great minds contributing, will speed up the rate of change.
We’ll think in the same metaphors though, share and reinforce new metaphors and then some Leonardo da Vinci of the 21st century will come along and break it apart. Though we may not appreciate their insights at all.
Mobile learning, smartphone learning on the move, or whatever you want to call it, should shake things up. At first this will be, and is, the same old stuff sent to your phone: basic card-to-card Q&A, even if it includes a bit of video or an animated graph.
I want learning projected onto the back of my skull, I want it in my head, not online or in a device. I want interactions with specific parts of my brain. I want my brain duplicated so that I can take more lessons at the same time, learn multiple languages and take several degrees simultaneously.
3. Will children, say, 50 years from now, look at paper and pen in the same way as a person does now when they take a first look at computer?
It is extraordinary, the relationship between our minds and our limbs, our arms and fingertips. With training we can sight-read a score and play complex musical pieces; we can scroll, cut, edit, fly and colourise images into a piece of drama that has us crying, or heads in our hands; and we can type like the clappers.
We can draw too, and sculpt, and swim and dance and do gymnastics.
Our relationship with the nerves in our body is a complex one. As for handwriting, our relationship with fountain pens, marker pens and pencils? It ought to be a skill still taught at school; there need to be handwriting competitions, as there once were … even if they are tied into art classes and design.
How different is a stylus on a tablet to a piece of chalk on a slate?
I implore my children to write and draw. An illegible Christmas list is no list at all. They’d type, they do type. Yet how backwards is a QWERTY keyboard?
4. Are we at some 'transition' point, and if we are, what does this mean?
Yes. And I mean to be part of it.
We have reached the Tipping Point.
A book I read, if I recall, in 2001, when we thought we were approaching a tipping point; actually we were reaching the point at which the first e-bubble would burst. First and last? These things go in cycles, whatever the politicians do to stymie human nature. Greed and regret, progress, reflection, reinvention … then we do it all over again.
We’re not even less violent than we were at the times of the Viking raids.
Meandering? A stream of consciousness? Reflection? Regurgitation?
All of this, and it all matters. You don’t have to read it, and you probably haven’t. This is here for me to find when I need it in seven months or seven years time.
It is remarkable how your views change; so it matters to have what you originally thought in front of you. There are memories I have that haven’t just been reworked over the decades, but have become different events. This isn’t simply age, though that has much to do with it. I view what I did as a child or teenager as I observe my own children today; the difference is, I can’t influence the behaviour and actions of my younger self, though I can, I hope, listen to and guide my own children towards actions and decisions they will feel comfortable with in the years to come.
Thomson, H. (2011) 'V. S. Ramachandran: Mind, metaphor and mirror neurons', New Scientist, 10 January 2011, issue 2794.
Then you settle into married life and children and, as I now do, celebrate my 18th wedding anniversary, my younger sister's 25th and the 50th anniversary of my in-laws. I read about people who plan to digitise their life. The ephemera I have includes the diaries and a trunk of handwritten letters; remember them? And letters this boy sent to his Mum from about the age of 8.
Wherein lies the value of it? A useful habit, as it turns out, but do we expect or want a new generation to store every text, every message, every Facebook entry? Are these not stored whether they like it or not ... and potentially shared? Whose business should it be, when and if, to 'disclose' or 'expose' a life? It can be of value, but it can also be harmful.
On the reverse side of this card is a note to my fiancée, written on 17th February 1992. We'd been engaged for 8 months, were living apart, and would be together that summer and remain together now.
The value of reflection here is a reminder of these sentiments. The value of any record, any stirred memory, can be to reinforce it, to be cherished, forgotten or dealt with. But if you haven't taken notes, you rely on the vagaries of your mind. So perhaps a massively scaled-down version of digitising everything you do may have value, like a brooch you press on occasion 'for the record'.
All of this STILL coming from a single Opinion piece in the New Scientist (23 December to 1 Jan) about someone digitising every moment of their existence.
This is how the 'professional' student or corporate blog should look ... no social networking, no flirting, no personal stuff, just the business - something to chew on.
Rather than feeling that I am entering the blog domain to write this, I ought to be able to syndicate, allocate or aggregate this as, or after, I have written it by clicking on one of three buttons:
At my behest I am therefore deciding that this is a moment to be shared (but not tampered with), evidence or information that I wish to store/collate (ideally by themes of my choosing), and/or a chunk of information (or offering) as wiki content (initiated or an edit insert).
Simplified and disingenuous, but a starting point.
And on reflection, perhaps, how good learning works: it starts with simple ideas that can be grasped and works outwards. E-learning doesn't simply work outwards though, it spreads in directions of the learner's choosing (ideally), like fractals, like a mind-map, as a result of, enabled and speeded up through myelination.
Were I writing a video script on eportfolios, wiki and blogs this might be how I begin, either animating this or going out and filming various traffic lights. I may paint this with water-paints onto laminate card and drop it into an aquarium and film it. My enduring analogy being that whatever we do online are but zeros and ones in a digital ocean, all programming does is remove the chaos and worthlessness of trillions of unconnected binary numbers.
Perhaps I've just convinced myself too of the value of Open Source.
And this is only the first idea of the morning. Something must have been brewing in my sleep.
Though I have yet to do justice here to the Opinion piece in the New Scientist, something struck me about the cover story on epigenetic changes and their relevance to evolution.
Q. What is Myelin?
I've ignored ideas in the past and regretted it.
I recall a lunch with a Cambridge Graduate who had created software that made texting possible. His company was looking at ways to expand its use on mobile phones. All I could think was that it was a retrograde step and would take us back to pagers; remember them? How wrong I was.
I recall also reading about someone who had kept a 'web-blog' (sic) and photo journal of their business year in 1998.
Even as a diarist and blogger I thought this somewhat obsessive. From research into the patterns and networks created, 'LinkedIn' emerged.
So when a Microsoft programmer, Gordon Bell, decides to make a digital record of everything he does to see what patterns may emerge, THIS time I take an interest. (New Scientist, Opinion, 23 December 2010 / 1 January 2011)
My immediate thought, not least because I lack the resources, is to be highly selective. Had I a team to take content, edit, transcribe, edit, collate and link, maybe I'd do more; I don't.
A professional swimming coach should assess and reflect on the sessions they deliver. I did this without fail for nearly three years, by which time a good deal of it was repetitive and I felt comfortable with the many different plans I was delivering to different groups. I've been videoed, I use video to analyse strokes and skills, and I use a digital recorder to jot down observations of swimmers. So what if I leave the digital recorder open for a session? Am I prepared to run through this hour for a start? If I do so, what might I learn?
That this is a valid form of evidence of my abilities (and weaknesses)
That edited (no swimmers' names revealed), it, or parts of it, could become a training tool (best practice), or simply insights for others into how, in this instance, a one-hour session is transmogrified for use with different levels/standards/ages of swimmer.
I videoed lectures in 1983 on Sony Betamax. I did plays. Debates. All kinds of campus activity around Oxford. I learnt a good deal. The camera is not your mind's eye; this is why you edit and develop craft skills, not because you want to dramatise reality, but because the mind does it for you. We don't go round with fish-eye lenses taking everything in; we jump between a wide, mid and close shot. And when we concentrate on something, the proverbial naked woman could walk down the street and you wouldn't notice. A camera around one's neck cannot and will not establish or adjust to any of these viewpoints.
The act of recording changes your behaviour, it is therefore a record of a false behaviour.
I filled some of the gaps. I set down some of my thoughts on how swimmers were performing whereas usually I'd make a 'mental note' or jot something down on paper.
Shortcuts will be uncovered, valuable algorithms will be written. Might, for example, the old corporate audit of how people spend their time be transformed if, putting it at arm's length, the function is monitored during a working day?
We've seen from the reality TV show 'Seven Days', shot in Notting Hill, how tedious the lives of Joe Bloggs can be as entertainment. We're tired of Big Brother too. As Bell remarks, 'most of the moments he records are mind-numbingly dull, trite, predictable, tedious and prosaic.'
To delve further, the New Scientist advises that we take a look at:
Use of email
Track Your Happiness
Your Flowing Data
Why might a handwritten diary be better not only than digitising everything, but even than a blog?
The way you write reflects your mood, captures tone, even levels of intoxication, passion or agitation, as well as your age. Though I fear the work of the graphologist is redundant. A choice is made over the writing implement, and the book or pages in which it is expressed. You make choices. If you must, you can have bullet points of events. It doesn't take much of a tickle for the mind to remember an exact moment. Such moments digitised are two-dimensional, with no perspective. A memory recalled matures; its meaning changes, as does your interest in it. A memory loved and cherished is very different to one that you wish to forget. What happens when both haunt you in their digital form? And when such memories become everybody's property?
Where does copyright stand if you are digitising life?
We watch TV, we read books, we play video games, we read letters and bank statements, we have conversations that are meant to be private ...
Meanwhile, I've barely dealt with the fallout of this Opinion piece in the New Scientist, and the next issue, looking at neuroscience, does my head in.
Here Vilayanur S. Ramachandran gets his head around the importance of metaphor in creativity and how it separates us from all other beings. I used to cheat.
It took you out of your own mind and messed it up; sometimes useful, sometimes not. The way to be creative is to develop an inquiring, critical, educated, multi-outletting, messed-up mind. Sing, dance, draw, paint, play musical instruments, climb trees, exercise in crazy ways; every week do or say something you've never done or said before.

'Quantable.' (sic) Radio 4, 6th January 2011, 20h30. The context was using the process of counting numbers to quantify some excess, and the interviewee used this term 'quantable', which the producer of the programme must have liked because it was repeated. Amazing how we can mash up the English language and the new word may make perfect sense.

Where am I? ECA and a job interview. So what am I doing here? Habit. I want to come back to these ideas later, and by doing this I know where it is.
I'll have had 3,000 words out of the New Scientist article 'Dear e-diary' before I'm finished. (New Scientist, 23 December / 1 January)
I can think of little else: how pointless it would be to record all that you do and see and hear all day. And then, taking a swimming group this morning, armed with a digital recorder and headset, I wondered: if I recorded my instructions and tips to the swimmers over 2+ hours, every week for a few months, at least following through all the strokes, progressions and skills, could this become the basis of a podcast series reduced to 4-6 minutes each? The kids would initially say something about the headset and mouthpiece ... and probably offer some commentary, most of which I could now lift out, having mastered WavePad.
There's a piece in the New Scientist this last fortnight about the merits of not only keeping a diary, but digitising every moment of your waking life. The piece opens with the suggestion that at this time of year everyone is buying a diary; they're not. Most of us buy a diary in August, as an academic diary, if you have anything to do with education (student, teacher, academic or parent), is a much more logical thing to have to carry you through the school year. In any case, who cares where you start your diary if it's digital, or even a Filofax insert?
There's criticism of the Five Year Diary format, those diaries with a few lines to cover the day's events; where did Twitter get its idea from? A few lines every day is far easier to achieve than a page or more per day. I should know, I've been at it long enough.
Parameters, as any writer will know, matter.
They contain what might otherwise be a stream of never-ending unconsciousness.
Deadlines and word counts are helpful.
I wish I could do the research against the clock too. I have to give X hours to each topic; I would happily give X weeks, and at some stage in that week I'd see I could press on for a month. I get this way when my curiosity takes me off on a mental ramble.
The idea of keeping an objective, digital record of everything you do does intrigue, not because of the data it captures, though I've had a few years that would be fun to relive, but because of what it misses out. This is the idea of a researcher at Microsoft who is recording his every action (and motion). However, the process misses out how you feel and what you think.
And it would have to be deactivated going through airports, going to the bank or Post Office, swimming with the kids in a public pool ... there's quite a list.
'It's a matter of love,' wrote Nabokov, 'the more you love a memory the stronger that memory becomes.'
How is such strength afforded a memory that remains on the surface of the mind, when there is no need to make the mental effort to embed it, or to recall it, other than watching it over like a movie? A very bad, very dull, badly lit and performed movie.
'By having everything in e-memory you don't have to remember anymore.'
On the contrary: by short-circuiting the implicit, instinctive, natural memory-making myelination process of the brain, you are replacing something fluid with something static, albeit a multitude of snapshots.
I gave up buying the Guardian on Saturday after a decade or more of doing so in favour of receiving the New Scientist every week; the reason is simple.
Too much of what I read in the paper I know already, and the colour supplement's target audience is the bottom of the bin.
I am rewarded this week with
- a) the news that Google have digitised 5 million books
- b) a piece on blogging 'Dear e-diary ... '
This ought to be how anyone who blogs begins their entry, 'dear diary;' blogging sounds like something Morris Dancers do in slippers after hours behind the pub.
Alun Anderson passes through the history of the diary with some clumsy thoughts on such things becoming popular gifts in the 1820s and the number of diaries inviting us to buy them at this time of year on supermarket shelves; actually, I find the academic diary is more popular in late August.
In one respect he is right; along with New Year's resolutions, keeping a diary from January 1st is up there.
Of course, we all decide to do this on the 5th or 6th, so have to invent an entry or three or four for the previous days. I've just been looking on the shelves where old diaries are stored ... (this stuff gets an outing once or twice a decade). For the reasons suggested above, some of the first few days of the New Year draw a blank, though I appear to have an unbroken record for the 5th and 6th of January since 1976. (I should add that the diary record over 34 years has about 13 years of blanks, so I'm not such an obsessive.)
I have an unbroken run from 1983 to 1987, and 1978-1982 are complete, but largely little more than a five-liner in a Five Year diary.
September 1979 is interesting though: short of the technology, I just about achieved what Gordon Bell, a senior researcher at Microsoft, is up to ... recording absolutely everything that ever happens to him with a digital camera strung around his neck. (I trust he'll call it an albatross.)
We've seen how relentlessly dull TV-manufactured life can be from Big Brother, so why will Gordon's life be any better? Or will the presence of the digital recorder prompt him into doing something 'worth recording,' i.e. mucking up any science he may think is going on?
What I did, not knowing for how long I'd do it, was to open the parameters of my diary page entries: from five lines every day, to an A4 sheet (no more, never missed), to as much as it could take; it took a couple of hours to write every night, which would of course lead to that vital practice of reflecting on the process of writing itself. That, and every bus ticket into town (Newcastle), the Commodores?! Tuxedo Junction. And the 'swimming baths' (sic). A play at the Gulbenkian. Godspell at the Theatre Royal. A Mars Bar for 3p.
Total Recall: How the e-memory revolution will change everything.
No it won't.
All the years I Twittered into a Five Year diary (about 60 words), my aim was to put in something that would remind me what happened that exact day; I'm forever staggered how I've achieved this on very little indeed. It requires a key, not the detail, just an Alice in Wonderland key that opens up the rest of it.
This is what Microsoft should be thinking about, not oceans of everything, but the meaningful flotsam and jetsam, that and the person saying what they think and feel about what is going on. Find me the third-party device that can record thoughts, feelings and dreams - it's a thing of fiction.
This item is written by the former editor-in-chief of the New Scientist, Alun Anderson.
It amuses me to see that the new New Scientist editor-in-chief is Roger Highfield. I don't suppose he can tell me what we ate when I had dinner with him in November 1984 in Wood Green (give me a sec) ... I can. And curiouser, and curiouser, though there's not a jot recorded on what we spoke about that night, I've an inkling I could share.
It is empowering to know I can ferret around in an old diary for ten minutes to get these answers; doing the same across some 16,000 blog entries takes only a few moments. Away from my desk, diaries or the Internet, however, I'm sure that all this ferreting around in the past has kept these memories accessible.
Gordon Bell will eventually uncover some patterns 'you would never have gleaned unaided;' I feel I'm ahead of the Microsoft game.
On dedicated diarists
In the Guardian Review in March 2003 William Boyd discussed the journal. I know this because it caught my eye on 9/03/2003 and I gave it a thorough blogging.
There are many sorts of journal (wrote Boyd):
- journals written with both eyes fixed firmly on posterity
- journals designed never to be read by anyone but the writer.
- journals content to tabulate the banal and humdrum details of ordinary lives
- journals meant expressly to function as a witness to momentous events of history.
- journals that act as erotic stimulants or a psychoanalytic crutch
- journals designed simply to function as an aide-memoire, perhaps as a rough draft for a later, more polished account of life.
But buried within these varying ambitions and motivations is a common factor that unites all these endeavours - the aspiration to be honest, to tell the truth.
The implication being that in the privacy of this personal record, things will be said and observations made that couldn't or wouldn't be uttered in a more public forum. Said Boyd.
(Wherein lies the blog's fundamental flaw. Do you tell the truth? Or skip the truth and become inventive with it?) Say I.
Hence the adjective "intimate" so often appended to the noun "journal". Said Boyd.
'The idea of secret diaries, of intimate journals, somehow goes to the core of this literary form: there is a default-setting of intimacy - of confession - in the private record of a life that not only encourages the writing of journals but also explains their fascination to the reader.'
Wherein lies the lack of interest in the blog as academic record and reflection; it is your reflection and your record. If on paper, it would be in an exercise book or a lever-arch file. Without some truth, some revelation, some disclosure, even exposure, it is but a carapace.
Seven years ago I invited people to comment, formed a group and promised to read the journals given below.
Few fellow bloggers came forward; it's a long run, a life-long marathon, not a thing you do as a relay team or with someone on your back.
Seven years on I may read some more of the journals listed below and see what insights they offer this blogger. I suspect I've read everything there is on Evelyn Waugh and Virginia Woolf - and everything they wrote (though I'm yet to jump into the River Ouse with my Gant raincoat pockets full of rocks; a passing thought as I walk the dog most days where the lady drowned herself).
William Boyd's Top Ten Journal Keepers
'It mimics and reflects our own wayward passage through time like no other writing form.' Boyd says.
'You have to be dead to escape the various charges of vanity, of special-pleading, of creeping amour-propre.'
The blog I kept for a decade and a bit more, from Sept 1999 to early 2000, spiralled into a non-chronological 'dump' on 37 themes.
Occasionally I pay it a visit; it's like digging around in my in-laws' attic (they give the appearance of having kept everything they ever read; they are voracious readers and are in their eighties).
A blog for me is:
- A record
- A journal
- An aide-memoire
- My deleterious exploits
- The past (every memory gathered in, every book read, every film seen).
- Dreams analysed
- My mental state
- Every stage and phase of growing up dissected.
This OU blog does have an educational remit. For me it's an attempt to be bustled back onto the tracks from which I became derailed. Perhaps. Or a compulsion to empty the contents of my brain down any drain that'll take it.
That, and I don't know what I mean until I've said it.
All this and I'm yet to get my head around the Opinion Piece in the New Scientist. 'Dear e-diary, who am I really?' and the potty idea of slinging a digital camera around your neck to record your every living moment.
Two things it vitally fails to pick up: what you think and how you feel.
Long live the diary, blog, journal-thingey.
12 months ago I was preparing to apply to West Dean College to study an M.A. in Fine Art; perhaps, now that I begin to look at the diaries of Paul Klee and Keith Vaughan, this is where I should be.
E-portfolios, good or bad thing?
Could they not become unduly burdensome? I have this image of us turning into snails with these vast aggregations of information on our backs (even if it is digital).
Are they for everyone?
New Scientist this week (16 Oct 2010, vol 208, no. 2782) puts 'Life Logging' into its '50 Ideas that will change science forever' list.
It all started with Vannevar Bush in 1945 with something he called 'an enlarged intimate supplement to his memory.' Fifty years on, Bill Gates is quoted as saying 'someday computers will store everything a person has ever seen and heard.' Somewhat over-ambitiously (especially as it went nowhere), in 2000 I registered the domain names 'The Contents of My Brain' and 'TMCB', thinking that there could be a place for an electronic diary, scrap-book, journal, album thingey.
I lacked the wherewithal or ambition to develop this further, in any case, I recall meeting the folk from Digitalbrain who seemed to be doing a good job of it.
Does there need to be a market leader?
Using a variety of platforms, are e-portfolios not being achieved already?
Some people look forwards, some look back.
Which kind of person succeeds? A sparsely filled e-portfolio might be a good sign - they are getting on with doing.
And whilst I'm a fervent Futurist, is there not a place for real portfolios (artwork), albums (photos, including those framed and on the wall in a real gallery), books on shelves and files in trunks?
I recently found my H801 file, March 2001. Course work printed out, the few articles sourced online printed off, even a painfully thin listserve thread forum message thingey. And an assignment on DCode (a CD-ROM for schools that won national and international awards, including a Palme d'Or for Multimedia at Cannes in 1998).
Had I put this online, would I have referred to it over the last decade? Instead serendipity leads me to finding it in a box in the garage. Does an e-portfolio facilitate serendipity, or is the process of loading it with 'stuff' going to be too prescriptive, so that ultimately it narrows minds rather than opens them up?
Old news keeps like fish.
Where does this expression come from?
Does it apply to course work too?
Even if I had an e-portfolio, of what value would my old History, Geography and English A-level essays be? Do they have more value digitised and online than in a file in a box in a garage by the sea?
The brain does something e-portfolios are as yet unable to do well, which is to forget stuff: to abandon content yet be prepared to re-link it if required to do so.
Time to quiz the neuroscientist me thinks.
The OU has stimulated my mind suitably over the last seven months to oblige a subscription to the New Scientist.
I was picking it up every other week for the Web Tech and other 'e-' related topics; these now feature regularly. My wife has ten years in medical market research; though not a scientist, she will often have an opinion on anything that touches her world of work. It is better read than the weekend colour supplement. In fact, I've ditched the Guardian once a week for the New Scientist once a week, with all other stories and news prompted by a sentence on TV, a couple of sentences on the radio and a paragraph or two online.
Beware the Irresistible Internet
Is it addictive?
Expecting or wishing to look at numerous e-learning style products for H808, I found I had spent three hours today doing this with Dropbox and Facebook. I wish I hadn't. I haven't even started to make Facebook sing, so would prefer to exit intact. And I suspect that Dropbox, like Amazon, Google, Facebook and Twitter, is just a neat trap, and that within six months we will be enrolled into a myriad of appealing, complementary services that we'll be paying for by subscription.
- technology-dependence clinic (Richard Graham)
- young men stuck in multiplayer online gaming environments
- women and adolescent girls using instant messaging platforms and social media compulsively
- obsession with screen-based media (Ofcom)
- Blackberry-addicted white-collar workers
Hearsay or fact? Not evidence, and the citations are sparse. But of interest.
- Is there such a thing as an OU obsessive?
- A blogging obsessive (certainly).
- If you have an obsessive nature.
'Now, the potent combination of omnipresent technologies and our addictive nature means more casualties look inevitable.' Paul Marks, Senior Technology Correspondent.
Marks, P. (2010) New Scientist. Volume 207. No. 2777. pp24-25.
Your inner voice.
How talking to yourself makes you smarter.
The Voice of Reason. Robson. (2010)
I was initially attracted to this edition of the New Scientist as the cover story offered to shed light on the value (or otherwise) of what some term ‘stream of consciousness’ and others ‘this voice in our heads.’ Of what value is it? And if I can type as fast as I can think, is this a true reflection of what I am thinking, at the pace at which I am thinking it, or does the process lose something in translation? Using how we think and what we verbalise is given value here, which ought to bolster the views of H.E. institutions that ‘reflection’ has a purpose. The article also explains why we need to give things terms, though I’m also always curious to know why certain words last while others do not. If I’ve understood the ideas correctly, then there is a suggestion that loose terminology (words for concepts that are not clear or still debated) is counter-productive; we need to be clear that our interpretation of a word, even something as simple as the colour yellow compared to orange, or hues of the colour blue, matches the understanding that others have.
‘On average, 70 per cent of our total verbal experience is in our head.’ Boroditsky (2010)
Language helps us to think and perceive the world.
Naming objects helps us categorise and memorise them. Lupyan (2010)
i.e. things (concepts and objects) are more easily thought about if ‘verbalised’ through having a name.
However, labelling can also bury the detail. Lupyan (2010)
i.e. we humans work best at the macro rather than the micro level of terminology?
‘Labelling objects helps our minds build a prototype of the typical object in the group at the expense of individual features.’
Language shapes perception, argues Gabriella Vigliocco of University College London. Vigliocco (2010)
The pumpkin test: 80% identified the object from seeing it alone; 85% of those who saw it and were told its name got it; while those who had what they could see in one eye ‘scrambled’ achieved only 75%, suggesting that a visual with a verbal cue helps to anchor the object in the mind.
‘It seems that words prime the visual systems of our brain, conjuring up a mental image when it is seen’. Vigliocco (2010:32)
Boroditsky (2010b) recently found that Russian speakers, who have two words for different shades of blue, really are faster at discriminating between the different shades than English speakers. (The once discredited Whorfian hypothesis). The effect disappeared when they repeated a long number to themselves, as this interfered with their linguistic capacities.
Fundamentally, knowing the name for something helps identify it. Lupyan (2010)
‘It seems that our inner voice changes the way we experience the world. Language is like augmented reality – an overlay that changes how we think, reason and see’. Clark (2010:33)
With the above in mind I started the following list with a view to developing reasons for not using the word ‘stakeholder.’ With no end of this list in sight I may need to change my opinion; I may not like the word, but it works. But does it? Whilst ‘stockbroker’, I can see, embodies a specific group of people, ‘stakeholder’ shifts constantly, like a cloud forming under a summer sun.
- shop floor worker
‘Up to 80% of our mental experiences appear to be verbal rather than visual or emotional.’ Hurlburt (2010) from the University of Nevada, Las Vegas.
‘It’s like a guidebook that has been developed by thousands of people before you, who have figured out what is important to survive and adapt to our environment.’ Clark (2010)
Do you work with the radio on or off?
With the TV on or off? Or in an Open Plan office? Do you prefer a library or study? Can you work as you commute? Or on holiday?
Based on what we have learnt above what impact might this have on what you are thinking?
Does it depend on how easily distracted you are, how focussed? Work (study) in an environment that is relevant to the task and this enhances it, whereas in an environment where verbal noise is a constant distraction you cannot (or could not) work so well?
Clark, A (2010) Language and Cognition, University of Edinburgh. Interview for New Scientist. 2776 (4 Sept 2010)
Boroditsky, L (2010a) Interview for New Scientist. 2776 (4 Sept 2010)
Boroditsky, L (2010b) Quoted in the New Scientist. 2776 (4 Sept 2010) from Proceedings of the National Academy of Sciences, vol 104, p7780
Hurlburt, R (2010) Quoted in the New Scientist. 2776 (4 Sept 2010) from Psychological Medicine, vol 24, p385.
Lupyan, G (2010) Quoted in New Scientist. 2776 (4 Sept 2010) from Psychological Science, Vol 18, p1077.
Lupyan, G (2010) Quoted in New Scientist. 2776 (4 Sept 2010) from Journal of Experimental Psychology: General, vol 137, p348.
Robson, D. The Voice of Reason. pp30-33 Cover Story. New Scientist. 2776 (4 Sept 2010).
Vigliocco, G (2010) Quoted in New Scientist. 2776 (4 Sept 2010) from Psychological Science, vol 18, p1007.
An hour in the middle of the night has been spent reading through the first task and all the various forum entries in H808 'The E-Learning Professional.'
This and applying for a job. Stymied by the need for three references. I've been such a hermit these last few years I worry that beyond family and friends the only reference I could get would be from my hairdresser and she might say something like 'he may be on time but I know he's seeing the barber down the road as well.'
Three dreams over the last ten days are bugging me - my lengthy reflection on these will go into the WordPress Blog (unless they prove to have something to do with the OU). I use a 27 point survey that usually reduces the dream to some mundane conclusion, though occasionally offers something more profound.
Ever on the look out for 'e-' words I spotted 'e-nose' in last week's New Scientist.
The e-nose refers to an 'electronic nose', rather than an 'electronically enhanced and largely online nose' after the 'e-' of e-learning. The e-nose can identify certain scents electronically, it transpires ... (though not across the Internet) ... yet. It wouldn't surprise me if a Google-e-nose were developed that could be used to search for and then offer recipes for food from your fridge that has escaped its packaging. Hold it up to your webcam and Google will advise.
The following was written out long hand with an ink pen.
I wonder if there is a stylistic difference, greater fluidity? (My son had squirrelled the laptop away and, it being the dead of night, I didn't want to disturb him.)
Is there software that can spot the stylistic difference of something written directly into a word-processor, like this ... or written out long hand, like this:
Whilst reflection is meant to help tackle complex problems, what if the issues are so chaotic, long term and intractable that far from helping to resolve a problem the act and habit of reflection simply re-enforces the mess?!
How much of it is online?
And how much of it is even electronic and/or enhanced?
This happens to be a reflective note being written long-hand onto a recycled A4 ruled pad of paper. It is anything but electronic, or digital. Nor, as yet, is it shared, or does it offer any chance of interaction, let alone collaboration with a group of friends, community of fellow students or the wider world.
The most important part of this experience is taking place in my head and is either one step behind, or one step ahead of this writing process. It is stream of consciousness. It is a singular, lonely and individual occurrence from which little will be gained by sharing it.
This is it: learning in which the 'e' is highly tangential.
Indeed, I'd go as far as to say that the 'e' component of my online learning, or web-based learning, or iLearning experience with the OU thus far is one in which the online quality of the process can be as discretely packaged as you would a book, a lecture or a face-to-face chat with a fellow traveller - it is one part, even a distinct part, an entity with barriers, parameters and a physical presence.
It is a part, not even a large part. But a catalyst. A resource. A tool. A track. (a word-processed addition here)
An audit of how this learner spends his time studying shows that half is off-line doing that all too traditional act of reading and taking notes; that of the remaining half another 50% is spent at a computer keyboard sometimes not fully aware or bothered about whether I am working online or off, using software on my hard-drive or the OU server.
(a hand-written omission here replaced with the following while typing online)
And if I continue this fractal-like halving of time spent studying, at what point do I reach 'e-'?
And does it matter?
The 'e-' is the fleck of saffron in a risotto.
Isn’t ‘re-invention’ the word? (Rogers, 2002, pp. 114-115)
Not wholesale repurposing, but as Rogers puts it, 'It should be acknowledged that rejection, discontinuance and re-invention frequently occur during the diffusion of an innovation and that such behaviour may be rational and appropriate from the individual's point of view.' (Rogers, 2002, p. 114)
I wonder how my experience might have been with a group of colleagues or friends, signing up together ... but might this too ‘spoil the party’? And how, over a longer period, fellow students would be emailing and messaging and getting on the phone ... let alone meeting up.
This fascinates me primarily because I am convinced that collaboration, sharing, discussion and so on is crucial to a deeper learning outcome. But does this not have to be down to the drive of the individual and permitted by the institution they belong to?
How much motivation can others really offer or be expected to offer?
If neither a carrot nor a stick will work with adult learners, especially in an online environment, then what do you do? ‘You can take a horse to the trough, but you can’t make it drink.’ As I’m about to take a course on the Psychology of Sport as a Senior Swimming Coach, I may gain some further insights into what motivates people to do something and how outsiders can influence this in a positive way.
And just because we’re invited to drink from this trough once does not mean we will do it again, or often, or with enthusiasm. Our moods will wax and wane, our commitments beyond the course will impinge.
Deep learning, as I’ve learnt, benefits from, even requires a rapport with one or several others at various levels of understanding – a Subject Matter Expert (SME) or experts, a tutor, a couple of fellow students on the course, and perhaps someone more junior who can be in turn mentored or tutored by us (first years being buddied by a second year, a post-grad student supervising a fresher).
How much this mix can be set by what little the OU or other Distance Learning Provider knows about an individual is quite another matter.
Do you run a call-centre like team of facilitators/moderators ... or aspire to the one-to-one relationship of tutor or governess to student mimicking some land-owning/aristocratic model of the distant past? Where is or how can that rapport that can work between student and tutor be recreated here? Or is this something for a DPhil?
A free-for-all would create imbalances, inevitably ... for the institution. But whose experience are we prioritising here?
Whilst a balance must be found, if the best outcomes come from giving tutors and SMEs much more time online to forge relationships, then this is what should happen. A good coach attracts the best athletes and the interest of other coaches. How does she do that? (Expertise, training and personality ... enthusiasm, putting the athlete at the centre of things.)
Perhaps by pursuing ‘educational social networking’ institutions are shooting themselves in the financial foot?
The time put in to make a freer networking between students, tutors and SMEs, with students in different time zones and different priorities would be prohibitive. Undergraduates studying on campus, in a homophilous cohort, with fewer worries (other than debt) don’t know how fortunate they are to have this opportunity to study, probably for the only time, before the life of the wider world impinges.
Are Personal Learning Environments (PLEs) a way, or the way, forward?
If I have this concept right (i.e. with the formal and tutor relationships given equal potential, and the tools in one place on the same homepage), it is a suitable progression from the VLE. Perhaps OU students are doing this anyway, by starting at their own blog or home page and simply anchoring the pages from the OU that matter most to them?
The New Scientist is running an interesting essay in its current edition which touches on all of this.
New Scientist (week 10th July 2010) has a piece called 'Generation F' by Richard Fisher (2010).
* 400 million worldwide ... on social networking sites.
* The importance of weak ties as well as close ones.
* The time it takes to forge 'reliable and trustworthy' ties.
* The value of 'acquaintances' to provide relevant and trustworthy news/information.
The article is prone to some hyperbole:
Social networking sites (Facebook, LinkedIn and MySpace) are the 'harbingers of a sea change in our social evolution, in the same way that the arrival of language informed our ancestors.' (Donath, 2009)
Danah Boyd (2009) describes Facebook as ‘an essential utility like water or electricity.’
Academics are just as guilty of this kind of thing; there’s been plenty of it in the reading for H807: the democratising of education, ‘starting the world anew’ à la Tom Paine, and claims made in the last ten or twenty years regarding ICT and education - what it could do, will do ... but hasn’t.
The essay is of value, though, for how, and if, social networking can be used for short-term purposes:
Online social networking appears to be 'very good for servicing relationships, but not for building them de novo.' (Dunbar, 2010)
H807 tries to use an ‘educational social networking’ approach. Or does it? Perhaps it is deliberately more self-contained than this. Though with the emphasis on authors such as Salmon (2002) and her model for e-tivities, is undue emphasis put on getting people talking and working together? Is that so necessary?
Isn’t experience showing that this is wishful thinking?
The OU must have research on this. Why do more people quit an online distance learning course (20-50%) compared to a traditional distance learning course? What are the views on conversations, synchronous or asynchronous, between fellow students, students in the wider OU community and tutors?
At various times, the ‘weakest links’ to fellow OU students through the OU blog have produced some useful support and insights for H807, yet engagement through our own Cafe/General Forum can surely only be described as minimal?
Whilst deeper learning experiences do come from sharing (like this), it isn’t happening to the degree the OU would like?
Collaboration between some random people I may meet at the bus stop when the service is delayed is not the same as forging an academic bond with someone, or some many, who are equally engaged with the material; whether their opinions are the antithesis of mine would be immaterial - indeed, disagreement would be better, it feeds discussion. This is NOT a criticism of H807: we have a common purpose, we have elected to do H807, there is a common profile intellectually, and absolutely the variety of life experiences enriches the experience. But clearly, as individuals, our approaches to learning, IT skills, time allocated to the task and many other factors will and do militate against certain ways of learning. Such as this.
If, on the one hand, the wish of some students, maybe most, is to stay at arm's length, aren’t the wishes or hopes of others, who would like to engage with a wider circle, being denied?
The sort of relationships between students that the OU is hoping for can surely only develop over a few years rather than a few months.
Jeff Hancock (2008) of Cornell University '... found that those with Facebook access asked questions to which they already knew the answers or raised things they had in common, and as a result were much more successful at winning people over.' (New Scientist, 10 July 2010).
We experienced the ease with which we could share personal information; there was no drilling or phishing for information, but clearly I will know more about some people than others. Its relevance is another matter; the buy-in to these people could eventually result in a bond of sorts, at least as far as working on this platform is concerned. I would have to look back through the way we respond to each other to see if the above occurred ... deliberately asking certain people certain things even though we knew the answer, as a catalyst to conversation. This does not work when discussing trivia such as pets and the weather (though I’ve indulged in plenty of that too ... it doesn’t lead to conversations on costing programming, what Vygotsky means by scaffolding, or whether we are fed up with e-tivities, e-granaries, e-moderators ... and e-jobs).
Mid-way through the unit we read Elliot (2008) and I took an interest in the way 'lifelong learning' functions.
I was looking at this as an adult learner environment: the merging of social, family and work through social networking sites, and the communication habits and styles of all three merging into and becoming a messed-up single entity. Historically, it wasn't long ago that work, family and social worlds were one ... fifty years ago, seventy, or a hundred years? No more.
Both of these points - revealing more, and the merging, or coalescing, or dropping of barriers between these spheres - are changing behaviours.
'Increased visibility also means our various social spheres - family, work and friends - are merging and so we will have to prepare for new societal norms. We'll have to learn how to live a more transparent life.' (Holtzman, 2009)
The idea of 'Exposure' was used by Ellen Levy in 1999 (Levy, 1999) after she had spent a year keeping a blog and photojournal, then a novel activity (Washington Post, 24th September, 1999).
What an employer, parent, friends or colleagues make of this is another matter, but then again, one day we’ll all be walking around with our DNA profile on a dog-tag (or embedded under our skin on a microchip).
The relevance of all of this?
How far can the individual be indulged within the parameters of an online course that must retain students and prove its worth to the institution (financial, academic, members), the students (worth it financially, academically, career-wise ... and personally) ... and the wider community (grants, a knowledgeable workforce, content and informed citizens)?
Je suis comme je suis
Je suis faite comme ça
(Jacques Prévert, 1946)
I am what I am, I was made this way.
Donath, J. Quoted in New Scientist, 10 July 2010, vol. 207, no. 2768, p. 40, www.newscientist.com. From Journal of Computer-Mediated Communication, vol. 13, p. 231.
Dunbar, R. (2009) How Many Friends Does One Person Need? Professor of Evolutionary Anthropology at the University of Oxford. Quoted in New Scientist, 10 July 2010, vol. 207, no. 2768, www.newscientist.com.
Elliott, B. (2008) Assessment 2.0: Modernising Assessment in the Age of Web 2.0 [online], Scottish Qualifications Authority; available from http://www.scribd.com/doc/461041/Assessment-20 (Accessed 1 February 2010).
Ellison, N.B., Steinfield, C. and Lampe, C. (2007) 'The Benefits of Facebook "Friends": Social Capital and College Students' Use of Online Social Network Sites', Journal of Computer-Mediated Communication, vol. 12, no. 4, July 2007, pp. 1143-1168. Quoted in New Scientist, 10 July 2010, vol. 207, no. 2768, www.newscientist.com (accessed 11 July 2010).
Fisher, R. (2010) New Scientist, 10 July 2010, vol. 207, no. 2768, www.newscientist.com.
Granovetter, M.S. (1973) 'The Strength of Weak Ties', The American Journal of Sociology, vol. 78, no. 6, May 1973, pp. 1360-1380, The University of Chicago Press. http://www.jstor.org.libezproxy.open.ac.uk/action/exportSingleCitation?singleCitation=true&suffix=2776392 (accessed 11 July 2010). Quoted in New Scientist, 10 July 2010, vol. 207, no. 2768, www.newscientist.com.
Golbeck, J. (2010) Quoted in New Scientist, 10 July 2010, vol. 207, no. 2768, www.newscientist.com.
Hancock, J.T., Toma, C.L. and Fenner, K. (2008) 'I know something you don't: the use of asymmetric personal information for interpersonal advantage'. Quoted in New Scientist, 10 July 2010, vol. 207, no. 2768, www.newscientist.com (accessed 11 July 2010).
Holtzman, H. (2010) Massachusetts Institute of Technology. Quoted in New Scientist, 10 July 2010, vol. 207, no. 2768, www.newscientist.com.
Kearns, M. (2009) 'Behavioral experiments on biased voting in networks', Proceedings of the National Academy of Sciences, vol. 106, p. 1347. http://www.pnas.org.libezproxy.open.ac.uk/content/106/5/1347.full.pdf+html (accessed 11 July 2010). Quoted in New Scientist, 10 July 2010, vol. 207, no. 2768, www.newscientist.com.
Levy, E. (1999) Featured in an article in the Washington Post, 24 September 1999. See more at http://businessinnovationfactory.com/iss/innovators/ellen-levy (accessed 11 July 2010).
Prévert, J. (1946) Paroles.
Pentland, S. (2010) Massachusetts Institute of Technology. Quoted in New Scientist, 10 July 2010, vol. 207, no. 2768, www.newscientist.com.
Rogers, E.M. (2003) Diffusion of Innovations (5th edn), New York, Simon and Schuster.
Salmon, G. (2002) E-tivities: The Key to Active Online Learning, London, Kogan Page.
Tom Tong, S. (2008) 'Too Much of a Good Thing? The Relationship Between Number of Friends and Interpersonal Impressions on Facebook', Journal of Computer-Mediated Communication, vol. 13, pp. 531-549.