My spouse and I got into a fun battle this morning about writing methods sections. I said they were difficult to write; she said they were easy, or at least straightforward. This could be attributed to the differences between qualitative research that doesn’t follow set protocols (my work) and quantitative experimental work (her work).
I’ve written about why these sections are difficult to write in my book about case studies. I’ve recently started calling this the inclusion/exclusion problem. It’s genuinely difficult to figure out what to include in the section and what to exclude. If you go to published papers in the prestigious journal Nature, you’ll find methods sections that reference supplemental files in two sentences…but when you check out those files, they are 80-100 pages long!
When I taught a writing class to engineering graduate students (most early in their careers), they expressed confusion to me about the inclusion/exclusion problem. It’s not intuitive how granular to go with the details of a methods section. You run into the PB&J sandwich problem. This is a classic classroom activity (I do it in my intro to tech writing classes) where you have students write out instructions for making a sandwich but assume different starting knowledge(s) for the reader. It makes explicit how difficult these problems can be. It shows the ways audiences co-construct meaning, including the ways writing is deeply interpersonal and not simply the transmission of information.
As I’m revising a paper for a journal, I’m thinking about this issue. Rather than aiming for a particular length, I usually think about how much trust needs to be established for readers to agree with what you’re presenting. In my own work, I usually overwrite because I want this trust firmly established. In the current paper I’m revising (which is about vaccines, a line of research that is a hobby of mine), I’m thinking through the computational packages we used and what readers of the journal (mostly qualitative researchers) need to know about those packages to trust the tools.
There isn’t a length, then, that I’m after, but a considered presentation of the tools and techniques, including an explanation of why they fit the research questions.
What’s your take?