So far, so good: Increased physical activity = productive thesis writing

2 May

I promised a little update to let you know how things were going. So my previous post was about joining the gym as a thesis writing strategy. It's been almost two weeks now, and I can honestly say it's been great. I go for my usual morning run as soon as I wake, then I have alternated between taking a yoga practice class (Mondays & Thursdays) and just being in the gym (every other day). I have one exercise-free day. My yoga practice is around lunchtime and I like to go to the gym around 3pm. Dave the dog gets little walks often and I also go to my photography course on a Monday evening. So my progress to date:


Thesis writing going very well (2 chapters revisited, edited & sent to supervisors)

Focus and concentration levels good

Feel energised

Mood happy

Shoulder pain improved (but is still there)

Becoming more toned and feel less sluggish (as I often do after a whole day in front of the PC)

So all-in-all, increasing physical exercise has been very positive, with nothing negative to report. Even the time it takes is offset by the gains above in focus and energy. This weekend is my weekend off, so I am heading off in the motorhome. PhD-free, and my exercise will be long dog walks (9 miles planned on Saturday & 12 miles on Sunday).


My strategy for ‘writing up’ my PhD thesis

18 Apr

So I’m at what I consider to be the final leg of my PhD – the ‘writing up’ stage. I don’t really like this term because it implies that I haven’t been writing. I’ve been writing since I started and I have every chapter in draft. However, draft is the word – some of these chapters haven’t been touched in months, years even.

As a full-time Lecturer, I am extremely fortunate to be given 12 weeks of protected study time to 'write up' (thank you University of Dundee!). In other words, to revisit all my chapters, revise, then produce a first full draft by the end of the 12 weeks. I still have some work commitments in terms of my postgraduate students, but this was my decision.

I'm excited but also apprehensive. While I've had blocks of study time throughout my PhD, I haven't had this much in one go. Having a big chunk with a very specific goal at the end makes me nervous – mainly in relation to the concentration I'm going to need every day. I must also add that due to the time I spend in front of my PC, I have developed what I believe is called 'mouse shoulder'. My right shoulder now burns constantly and I worry about the impact of even more time at my PC.

So, I have come up with a plan to help me cope with all this. Basically I’ve decided to increase my physical exercise, therefore I’ve joined the gym.

Exercise is an integral part of my life anyway – I run most mornings and I practice Ashtanga Yoga (albeit not very well). I also get out and about with Dave – my dog (and my camera). However, for the next 12 weeks, I don't think this is going to cut it, so I've just joined David Lloyd's gym. I had a gym membership years ago (only for using gym equipment), but decided to stop when I realised I was there more than the staff (damn those endorphins!). I had a taster Christmas special membership with David Lloyd and I loved the range of classes (apart from the spin class – that was just nasty!) and the times they were on. Plus, it's only a 5 minute drive from home. I'm hoping this will also encourage my teenage daughter to actually use her membership too!

Anyway, with the goal of staying healthy and producing a first full PhD thesis draft, here is my plan for the next 12 weeks:


I am generally a healthy eater anyway, but alongside this I shall ensure my diet remains reasonably healthy and I’ll keep hydrated. Those who know me, know I love my red wine and a glass or two of bubbly – although I only tend to indulge at weekends. I plan to keep doing this! I do know of people who have abstained completely while writing up – this I couldn’t/wouldn’t even contemplate doing!

So that is my plan…. It sounds a little rigid, but I do need structure. Obviously, I'm flexible about when I take my exercise classes and will work around that, and I plan to fit in the occasional swim and sauna. I also might (probably will) have a little lie-in at the weekends! However, all-in-all, I'm hoping this will keep concentration and motivation levels up and prevent procrastination, distraction, cabin fever and worsening 'mouse shoulder'.

So…. bring on the next 12 weeks I say! I shall let you know how this is going after a week or so (if you are interested, that is). I'm really keen to hear about the types of strategies you have tried (or are trying) and whether they've worked (or not!). Oh, also – if anyone knows about mouse shoulder, I'd be eternally grateful for any advice on how to make it better, or at least to make it stop burning #ouch

Changing PhD Supervisors: Help or Hindrance?

8 Apr

I used to think the latter.

In an ideal world, one is assigned two or three supervisors right at the beginning of one's PhD, who see you through to the very end. Most of us know, however, that doing a PhD never really goes perfectly to plan. I know now this is just another part of the fun (!)

I have changed my main supervisor 3 times now – through no fault of my own, I hasten to add. Well, at least as far as I know. I 'lost' my supervisors simply due to them leaving for pastures new. Although I had the option of continuing and being supervised from a distance, I decided (after thinking long and hard) that this wasn't something that would be good for me. Thankfully, maintaining a certain level of stability, one co-supervisor has stayed with me from the beginning. I appreciate that having 3 different principal supervisors has been tough for her too – but hats off to her, she is still with me!

Admittedly, this has been far from easy, as I have noted in a previous blog post. Indeed, I am not ashamed to admit that I have shed many tears over it. For totally selfish reasons, I felt abandoned and often felt I was being forced to take a path that wasn't on my agenda.

My first supervisor change was extremely difficult as my new supervisors challenged me in a way I hadn’t been challenged before. I was unable to defend many things. This then led me down a backwards path for quite a long time. In time however, I realised that I needed to be led down this path in order to be able to move forward.

My second supervisor change was daunting, as I expected to be led down this backwards path again. This, however, was not the case. Sure, I have been challenged, but this time it was welcome. I was now sure of what I had done, why I had done it, how I had done it and how it contributed, therefore the challenging questions were welcome. Yes, of course, the reins have been pulled a little while I re-think some things and go back and forth to revisit work, but I can see how much stronger this is making my work. Although…. my literature reviews are a pain in the ass – they have always been a pain in the ass. I thought in time, my literature reviews and I would develop a mutual understanding, but it looks doubtful….

Anyway, I digress…..   Now that the dust has settled, I look back upon my supervision changes and challenges and have come up with this:

  • Supervision is subjective – accept that. However, there is more than one way to skin a cat… That said, also respect each supervisor in their own right. More about styles of supervision here
  • Set ground rules together: What was expected from previous supervisors may be different to what your new supervisor expects.  Similarly, establish what your supervisor will do for you in terms of reviewing work, supervision sessions etc. It may not be the same as your previous supervisor
  • Don’t say “well my last supervisor said……”
  • Embrace differences of opinion (no matter how hard this is). If your supervisor doesn’t like or agree with what you say or have been doing, then defend it. If you can’t defend it, then revisiting is probably what you need to do
  • Keep a research journal. Your PhD is an iterative process. I don't know about you, but sometimes I can't even remember what I said yesterday. You need to know what you did, when you did it, the reasons why you did it and what the outcome was throughout the whole process. This is your research journal
  • Don’t be frightened of going back and revisiting/redoing work. It’s tough, really tough – but it’s an important part of the process. By the time you come to your viva, you will be able to justify what you tried, what didn’t work, what you did about it etc.
  • If you don’t understand, don’t be afraid to say so!
  • Remember all your supervisors in your thesis acknowledgements – they all helped you get where you did

So, I'm not going to lie, changing supervisors has been difficult, really difficult at times. However, on reflection, I feel very fortunate. I know I have had an amazing opportunity working with different, very well respected professionals, all with different expertise and all with very different approaches to supervision. I know I can articulate my argument and be very clear about different approaches and processes, not just the ones I took, but all the others I was challenged about. I feel confident when being asked challenging questions – in fact I welcome them (most of the time!). This I know will (I hope!) stand me in good stead come viva time. So to answer my initial question – (only) from my experience, this has been a help rather than a hindrance.

More about changing PhD supervisors by @eljeejavier

If all the above fails and you really cannot continue with your supervision team, you may find this post @thesiswhisper  helpful: How to tell your supervisor you want a divorce

I would love to hear from you so please do leave comments. Have you changed PhD supervisors? How was it for you? Did you have similar or different experiences from mine? Are you going through a challenging time with changing supervisors? Is this just an initial transitional challenge or perhaps it goes deeper than that?

It would also be great to hear from supervisors – although I have blogged from a student’s perspective, I have absolutely no doubt that from a supervisor’s perspective, gaining a student who has had previous supervisors must be very challenging!

Member checking vs dissemination focus groups in qualitative research

27 Nov

Historically, member checking (also known as member/participant validation) qualitative research findings has been viewed as an important aspect of establishing accuracy, credibility and validity (Koelsch 2013). Simply, member checking occurs when the researcher returns to participants to seek approval that the researcher has accurately reported their narratives and to gain further comments.

I hadn't given member checking much thought (I conducted focus groups with members of the public and healthcare professionals, in addition to one-to-one interviews with newspaper journalists and editors). It wasn't until I had finished my preliminary data analysis that it was suggested to me by my supervisors. This, I admit, wasn't a welcome suggestion, mainly due to the challenges it would likely cause. However, not being one to dismiss supervisor suggestions, I took myself off to explore this concept further. The outcome of this exploration was that I would not conduct member checks as I could not see a clear benefit. Here was my rationale:

  • As some of my focus groups were opportunistically undertaken from already-formed social groups, locating the same participants was likely to be impossible. They were also conducted quite some time ago
  • Geographically, my focus groups were conducted in another part of Scotland. I didn’t have the time or the energy at this stage of my research to travel back there for this purpose, especially when I questioned the effectiveness
  • Even if I was able to locate my participants, I could potentially cause them discomfort by having them listen to sensitive issues being discussed, especially around my interpretation of their narratives
  • My participants could also feel uncomfortable hearing their own words
  • Exposing my preliminary findings and interpretation to my participants could make me feel uncomfortable (not a big deal but nevertheless the potential is there)
  • My participants may have forgotten they had said things and therefore not be able to validate them. Alternatively, they may unintentionally wrongly recall what they said and change the nature of the discussion that actually took place
  • My participants may request the removal of valuable data from the focus group. Also, they may have changed their perceptions about something and request that their narrative or part of their narrative is removed
  • The same group dynamics can never be recreated. Since group dynamics and interaction are a key component of my focus group data analysis, it was deemed impossible to recreate them

However, this then left me with a gap. Although I made the decision not to conduct member checks, it didn't mean I could ignore the issue. This meant further reading and exploration. I also took to Twitter to help me and received some excellent responses, in particular from Dr Bronwyn Hemsley @BronwynHemsley, who had similar experiences.

Taking into consideration all the above points, and, importantly, keeping my epistemological stance of weak social constructionism and my methodological approach (interpretive descriptive methodology) at the forefront of my mind, I knew I wasn't looking to 'validate' my findings, nor did I want to seek confirmation of a 'truth'. Rather, I wanted to present my conceptual thinking and seek thoughts and ideas as to how I could further develop it. Equally, I wanted to explore whether I had missed something important. I therefore went down the route of dissemination focus groups. This is advocated by Rose Barbour (2005) as a more useful way to feed back preliminary findings than member checking.

So what I did was convene one focus group (six people, two of whom were original participants) with a mixture of members of the public and healthcare professionals (reflecting the characteristics of my participants). I prepared a Prezi and presented the key categories from my findings, then asked specific questions for further discussion. The focus group was recorded with permission from the group and lasted just over one hour. I also provided light refreshments and gave small gifts as a token of my appreciation. Overall, I found this experience hugely beneficial as it:

  • Helped me explain and contextualise my study as a whole concisely and succinctly (something which has never come easy for me!)
  • Enhanced my analytical and interpretational sophistication through agreement and offers of further considerations
  • Crystallised similar and different perspectives from both the public and healthcare professionals
  • Helped further consider my findings in terms of what they mean in relation to informing practice and policy today and for the future
  • Was fun for me and those who took part

If you are considering member checking for qualitative research, I would definitely recommend dissemination sessions as an alternative. I'm not, however, saying this is the right way and member checking is the wrong way, or indeed that the way I did it was the right way – we know there is no right or wrong in qualitative research. What I am saying is that this was the right way for me and my research. I imagine there are various and innovative ways in which this can be done, but hopefully sharing how I did mine gives food for thought. I would be very interested to hear from others about their experiences of either member checks or dissemination sessions (interviews or focus groups). Were they helpful or a hindrance?

Qualitative data analysis: data display

20 Oct

The first thing I want to say is that data display was lots of fun!

So my last blog post finished after I had developed and played around with my propositions before moving onto data display.

Miles et al. (2014) dedicate six chapters to data display (part 2 of their book). I read and re-read these chapters a number of times before I could get my head around everything. Had I not done this, I can see how I may have gone down an inappropriate avenue. Miles et al. provide various suggestions along with some smashing examples of how data can be displayed – mainly through matrices and network displays.

For my study, I created matrices (with defined rows and columns). Miles et al. describe matrix construction as "a creative yet systematic task that furthers your understanding of the substance and meaning of your database" (p.113). A key point that resonated with me was that it's not about building correct matrices – it's about building ones that will help give answers to the questions you're asking. To do this, they advise us to "adapt and invent formats that will serve you best" (p.114).

An important conclusion I came to? I didn’t need to use (or fully understand) all the matrices/network displays. I took what I needed to (role-ordered matrices) and combined it with a little of something else (Framework matrices) to allow me to display my data in a way that helped me move on with analysis and progress through to interpretation – always with my research questions at the forefront of my mind (and pinned to my office door).


So here's what I did: I created a matrix for each combination of main theme (n=4) and focus group (n=15). In total, I created 60 matrices.

My participants were entered down the first column (one per row), and within each participant cell I also identified key demographic characteristics. Each subtheme was a column heading. I can't provide an example of one of my own matrices in NVivo as the data is legible, so the image below is a QSR example from their volunteering study.

The beauty (and massive time saver) of NVivo is that when you click in each cell (number 4), the data that you have coded (for the individual within that theme) is displayed on the right of your matrix (number 3). This is referred to as the 'associated view'. Obviously, when you first create your matrix, all the cells in the middle will be empty, so a summary needs to be entered into each cell from the coded data (associated view).

For my study, I read through all my coded data and my summaries were developed using the following:

  • Including sufficient detail that was understandable and not overly cryptic
  • Retaining my participants language
  • Sometimes including short verbatim excerpts if I thought they were necessary. All quotes were kept in italics
  • Including my commentaries (in a different colour) about context and focus group interaction

A simple but important thing I noted when writing my summaries is that not all cells had coded data, in which case no summary was required. I always wrote 'NC' in those cells so I knew a blank cell was intentional and not an unintended oversight.
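To make the cell-filling logic concrete, here is a minimal sketch in plain Python (not NVivo – the participant names, subthemes and summaries are all hypothetical) of the idea above: one matrix per theme/focus-group pair, one cell per participant and subtheme, with "NC" explicitly marking cells that had no coded data.

```python
def build_matrix(participants, subthemes, coded_summaries):
    """Build one display matrix: rows are participants, columns are subthemes.

    coded_summaries maps (participant, subtheme) -> summary text.
    Cells with no coded data get "NC" so a blank is never an oversight.
    """
    matrix = {}
    for p in participants:
        row = {}
        for s in subthemes:
            row[s] = coded_summaries.get((p, s), "NC")
        matrix[p] = row
    return matrix


# Hypothetical example data for one theme within one focus group
participants = ["Anna", "Ben"]
subthemes = ["Trust", "Risk"]
summaries = {("Anna", "Trust"): "Values professional advice"}

m = build_matrix(participants, subthemes, summaries)
# m["Anna"]["Trust"] holds the summary; m["Ben"]["Risk"] is "NC"
```

The same scaffold would simply be repeated for each theme/focus-group pair, which is where the 60 matrices (4 themes × 15 focus groups) come from.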

Not surprisingly, as with all stages of data analysis, this process was extremely time consuming. However, by the time I completed it, I had so much more insight into what my data was telling me – for example, the similarities, the differences, the unsurprising and the surprising.  I generally gained a much deeper understanding of what was going on.

However, it didn’t end there. I wanted to compare and contrast my data not only within focus groups, but between focus groups. This I found difficult on a computer screen as I had to jump back and forth across so many matrices.  So….. similar to my propositions, I left my PC and went back to flip chart paper. To be honest, it was a nice break from sitting at my PC.

Another beauty of NVivo is that matrices can be exported into Excel. I did this, then transferred them again into a Word document (I like prettifying my tables with colours etc. and could only do that the way I wanted in Word). It cost me a little more time, but nevertheless, it was worth it. I then printed my matrices out (all 60 of them). For each main theme and its subthemes, I sellotaped 3 flipchart paper sheets together (so that they were long enough to display all 8 focus group matrices down both sides) and glued my public focus group matrices down the left-hand side and my healthcare professionals' focus group matrices down the right-hand side.

These matrices on the flip chart paper then became my focus for a few weeks. I read them, compared them, returned to the literature, returned to my memos, reflected and took time away to think (long dog walks on the beach helped hugely with this). While I did this, I used the white space in the centre of my flipchart paper (between the matrices) to scribble down my thoughts and concepts. For me, this stage enabled the progression from description to interpretation. I even took them to one of my supervision sessions so I could talk through some of my thoughts and illustrate the process I took to get there. I can show you an image of this as the text is not legible – this is one main theme (with 4 subthemes as column headings (in blue)). The peach rows are my participants:



So in a nut shell, my data display process helped me to get my creative thinking underway for interpretation. I then used these matrices to help me write up my first draft of my findings.

I hope this has been helpful. Qualitative data analysis is so diverse and complex, and depends upon a number of variables, particularly your methodological approach, so there really is no 'one size fits all'. Please do respond to this post and share your experience of the process you took and how it worked for you. Or did you do something similar to me? 🙂

Qualitative data analysis: data condensation (aka reduction)

12 Jul

Rather than trying to squeeze my thoughts and experience of all my data analysis in one blog post, I intend to write shorter (!) posts of different stages as I progress. My last blog post was about developing my analysis plan.  Knowing what I know now, I am so thankful I spent the time doing that!

I am following Miles and Huberman's approach to data analysis and have the 3rd edition ~ Miles et al. (2014). There are lots of similarities between the editions but also some differences. I prefer this edition.


The purpose of this post is to share with you my first step of data analysis – data condensation. This used to be called data reduction (Miles and Huberman 1994) but was changed because 'data reduction' implies "weakening or losing something in the process".


So immediately following my focus groups and interviews, I took extensive notes about salient factors (more about that in another blog post). From these notes I created a contact summary form, as advocated by Miles and Huberman and one of my supervisors, which synthesised all this information. This is a very simple and highly valuable thing to do. I have repeatedly referred back to my contact summary forms throughout this process (if anyone wants the template, just ask). I also transcribed verbatim all my focus groups and interviews myself as soon as I could after data collection. This was a very, very long and at times laborious task, but again highly valuable for really getting to know my data.


I listened to each audio recording (listening only ~ no note taking). Then I read each transcript (reading only ~ no note taking). Then I listened to each audio recording again whilst reading my transcripts. This time I scribbled notes down on a pad and drew various mind maps and diagrams. After all that, I was pretty sure I had immersed myself in my data (even though I hated listening to myself!).

I prepared transcripts for importing into NVivo 10. This involved ensuring consistent format and style and anonymising my participants by allocating each of them a pseudonym and a code to differentiate public, healthcare and media professionals (a blog post about this here). This process took quite a bit of time, but if not done thoroughly, I can see how this could have caused me many problems later on.

1st level coding: I developed a starting coding list based on my theoretical framework and wider literature to get me started (initial deductive approach). I listed these codes onto a coding framework with clear operational definitions so I had a clear understanding of what type of data needed to be assigned to each code. Throughout this stage, codes were revised or removed and additional codes and subcodes were created as new themes emerged from the data (inductive approach). As I revised my codes, each transcript was re-read and re-coded. I made sure at this stage I didn’t try to force my data into anything and that codes and sub-codes were all kept very descriptive.

Pattern coding: This was about working with the 1st level codes and sub-codes so that they could be grouped into more meaningful and general patterns. This process was a little more challenging for me because at times I was aware that my thinking was going a little too fast and that I needed to remain fairly descriptive. I was also frightened of condensing too much and losing some of what I had. However, the beauty of NVivo is that you have an audit trail, so if you do need to go back, everything is still there (I saved a copy of my NVivo project at the end of every day). While pattern coding, I examined my data carefully and asked a number of key questions such as: What is happening here? What is trying to be conveyed? What are the similarities? What are the differences? In doing so, I explored not only the similarities but also the idiosyncrasies and differences. This process took quite a number of iterations before I was happy to move on.

Memoing: to help me through the process of coding, I created LOTS of memos which captured a wide range of my thoughts and concepts. I was also able to link my memos to my data and any external resources such as websites or literature. Again, I cannot stress enough how valuable this has been (and still is). My research journal was also created as a memo.

Propositions: it took me a little while to get my head around what I needed to do here as I have always associated propositions with case study research. This was another lengthy process but has helped so much as I started to gently move from the descriptive stage to a more conceptual and interpretive stage. I went through all my coded data and developed propositions from them – so basically a summary or synthesis of my data. I initially developed 613 propositions then reduced this to 479 following removal of duplications. In order to have a better visualisation of these, I left my computer and turned to flip chart paper. I printed each proposition (within their pattern codes) on different coloured post-it notes and arranged and re-arranged them (lots of times!). This then led me to revise my pattern codes again. Of course, with that, I revisited all my data and yet again re-coded into my final revised pattern codes and sub-codes. Just to say at this point – this doesn’t mean these pattern codes are written in stone. They can be (and will likely be) altered again as my interpretation progresses.
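The de-duplication step above (613 propositions down to 479) can be sketched in a few lines of Python. This is an assumed, illustrative version of that logic (the pattern codes and proposition texts are invented, and the real process was done by hand with post-it notes): it keeps each proposition's first occurrence within its pattern code and drops near-identical repeats that differ only in case or spacing.

```python
def deduplicate(propositions):
    """Remove duplicate propositions, preserving first-seen order.

    propositions is a list of (pattern_code, text) pairs. Two entries
    count as duplicates when they share a pattern code and their text
    matches after normalising case and whitespace.
    """
    seen = set()
    unique = []
    for code, text in propositions:
        key = (code, " ".join(text.lower().split()))
        if key not in seen:
            seen.add(key)
            unique.append((code, text))
    return unique


# Hypothetical propositions grouped by pattern code
props = [
    ("trust", "Public trusts clinicians"),
    ("trust", "public trusts  clinicians"),   # duplicate after normalising
    ("media", "Headlines shape perception"),
]
# deduplicate(props) keeps the first "trust" entry and the "media" one
```

In practice, spotting conceptual (rather than merely textual) duplicates still needs a human eye, which is presumably why the flip chart and post-it notes earned their keep.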

So in a nut shell – that was my data condensation. Obviously we know that qualitative data analysis is not a linear process and requires many, many iterations. While at times, this may be frustrating, it’s necessary and can be fun!

On a final note, if anyone is thinking about a CAQDAS programme, I cannot recommend NVivo 10 enough – I absolutely love it (I cannot comment on any other CAQDAS programme as I have only used NVivo). I know many people prefer manual analysis for a number of reasons, which is absolutely fine. NVivo has helped me hugely to store, manage and interrogate my data (of course it won't interpret or write up my findings though!). The support you receive from QSR International, through many channels, is also first class.

I am now in the throes of data display and developing lots of framework matrices. Another really exciting stage, and one that continually challenges my current thinking. That will be my next blog post. If you want to ask me any questions about my experience of data condensation, please ask away. Any comments would also be very welcome! I'm really trying to keep my blog posts short, but as you can see, I'm not doing well with that!

Qualitative research: pseudonyms or no pseudonyms?

27 Mar

As I prepared my focus group and interview transcripts for entering into NVivo, ready to begin my data analysis, it occurred to me just how many participants I had (82 in total – members of the public, healthcare professionals and media professionals). I started to think about how my participants would be represented in my thesis. Obviously, to maintain the principle of beneficence, they had to remain anonymous. The use of pseudonyms is, as we know, recommended, but I wondered if this was a unified strategy or if anyone had other thoughts. So I posed this question on Twitter to explore further:

Some interesting discussions followed and some issues arose that I hadn’t previously considered.

There was a general consensus from the responses (PhD students, researchers and a PhD supervisor) that using pseudonyms was a good idea as it allowed participants to feel like real people.  Consequently, it helped researchers portray their story effectively and maintain that human element.  This of course is key in qualitative research.

However, there were some important considerations raised.  Not everyone used pseudonyms and someone felt that codes were easier for the reader to track and relate to, whereas names could perhaps be easily forgotten.  Others stated that it was important to use both – the pseudonym for the human element, but also codes to differentiate between groups of people.

So how do people choose their pseudonyms? Suggestions included using a random baby name generator from the internet or Googling the most common baby names for the participant's year of birth (I can see why allocating a 'Chantelle' to an 80 year old lady probably wouldn't be the best choice!). Someone also suggested choosing names that sound similar to participants' own. I know others have asked their participants to choose their pseudonym, but this can be challenging if the same name is chosen by a number of participants. I also wonder what the implications are if a participant is able to identify him or herself in the research findings?

So the outcome of this Twitter conversation, in relation to my study, is that I am using pseudonyms in addition to codes (which is also what one of my supervisors did). This is because I really want to keep the human element, but as I am analysing my public, healthcare professional and media professional data together and will be reporting my findings in one chapter, it is imperative that the groups are differentiated.
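A pseudonym-plus-code scheme like the one described above can be sketched in a few lines. This is purely illustrative (the group prefixes, pseudonyms and label format such as "Margaret (HCP-01)" are my invented examples, not the labels actually used in the study): each participant keeps a human name for readability, while a numbered group code makes the public/healthcare/media distinction trackable.

```python
import itertools

# Hypothetical group prefixes for the three participant groups
GROUP_PREFIX = {"public": "PUB", "healthcare": "HCP", "media": "MED"}

def label_participants(participants):
    """Map each participant id to a 'Pseudonym (GROUP-NN)' label.

    participants is a list of (participant_id, group, pseudonym) tuples;
    numbering restarts at 01 within each group.
    """
    counters = {g: itertools.count(1) for g in GROUP_PREFIX}
    labels = {}
    for participant_id, group, pseudonym in participants:
        code = f"{GROUP_PREFIX[group]}-{next(counters[group]):02d}"
        labels[participant_id] = f"{pseudonym} ({code})"
    return labels


people = [
    ("participant-1", "public", "Margaret"),
    ("participant-2", "healthcare", "Alan"),
    ("participant-3", "public", "Joan"),
]
labels = label_participants(people)
# labels["participant-3"] is "Joan (PUB-02)"
```

One design point worth noting: keeping the id-to-label mapping in a single place (and well away from the thesis itself) makes it easy to check that no pseudonym or code accidentally collides across groups.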

Another critical aspect that needs to be considered is that even when using pseudonyms, participants can still be identifiable, especially if they are from small communities. This is something that I need to be mindful of, as some of my participants live in a small community which had experienced a traumatic event, and some are journalists working for specific newspapers. Even though I have given them different names, I need to ensure that no one can be personally identified (or connected with a professional organisation) in any way.

A special thank you to @strictlykaren , @Acrobat13, @merry30, @SarahLaneCawte, @gtombs, @Paully232000, @AbigailLocke, @VickiMcDermott and @CET47 for their insightful twitter comments and feedback 🙂
