In researching Basic Writing and issues related to access to technology, I have encountered a handful of methodologies, most of which fall into a similar category: the author frames the issue with a literature review and then develops it through classroom anecdotes. The anecdotal evidence may take the form of a case study in which a few students serve as objects of study, or it may draw from the experience of a single course, with student work called upon as evidence. In either case, the anecdotes or descriptions are narrative in nature. Other pieces are less pragmatic and more akin to thought-pieces: they cover a literature review, frame the problem, and then call for change or action.
The Journal of Basic Writing (JBW) is the main publication for Basic Writing scholarship. Its submission guidelines express no preference for methodology: http://wac.colostate.edu/jbw/submit.cfm. Other journals in Composition Studies (CCC and Computers and Composition) likewise do not forbid any methodology, though they do at least list empirical research as an option, which JBW does not. The preponderance of narrative articles does not appear to be a stated goal of the journal but rather something that simply happens; I wonder if the bulk of submissions arrive in this style. The articles I have located about technology in JBW rely heavily on these methods, with little quantitative research.
When I interviewed Dr. Kevin DePew about the methods used in BW research, he indicated he was not surprised by the reliance on narrative. He speculated that it may result from the fact that people hired to teach BW often come from a variety of disciplines that value narrative. Furthermore, their own teaching experience is often “trial by fire,” which also lends itself to a narrative tradition in the articles (DePew).
In regard to credibility, I do wonder why BW scholarship is so rarely cited by other subdisciplines, much the way education scholarship is rarely cited by composition studies. Some of the blame may lie with the fact that BW journals live in different databases than the journals composition studies scholars use. While JSTOR is the go-to database for my fellow students, I have to trawl through ERIC for JBW articles. An additional concern is that plenty of online journals, though peer-reviewed, are not housed in the academic databases at all. There is a wealth of work being done; one wonders if it is less referenced because one has to know to look for it.
At this point, my only benchmark for how authoritative or accepted BW methods are comes from how they are cited within BW, because these articles do not appear to be cross-referenced in fields outside of BW. Again, I am unsure whether this stems from the database issue or from a lack of respect for the methodologies used.
As referred to in some of my earlier papers, DeGenaro and White argue: “Instead of moving toward a consensus, our researchers too often talk past each other, positions are reiterated rather than reconsidered, and we move in circles” (23). It feels as though BW scholars would rather redefine an issue, or reframe it within their own institutional or classroom context, and then draw conclusions from their own teaching experience than use the work done in other contexts to extrapolate to their own. I see this happening at conferences too, where someone’s idea will be invalidated by a comment: “Well, that won’t work for my students.”
It may feel as though I have drifted from the main questions for this paper, but for me, all of this is part of the question of methodology. If I am not 100% comfortable with the methodological choices of my discipline, must I hold rigidly to them? I distrust the trend of hypothesizing significant change based on the anecdotal evidence of one person’s classroom experience.
In previous think-pieces, scholars have observed that socioeconomic status affects both access to technology and comfort using it, and that students in Basic Writing are often the most disenfranchised. To my knowledge, no one has studied the extent of this disadvantage or how much it impacts students’ writing ability. In my own classroom, I vacillate between not wanting to let technology become an additional barrier to student learning and not wanting to perpetuate the digital divide into BW, thereby creating a new barrier to students’ success in FYC.
In 2013 and 2014, I surveyed the writing classes at my institution to ascertain students’ access to technology outside the classroom and their comfort using it. Classes were sampled to ensure representation across all four of our campuses, all four levels of our writing classes (two BW levels and two FYC levels), and the various times of day the classes met. We eventually surveyed our online courses as well. The survey was given in the first four weeks of the semester and again in the last four, with the same questions on both, plus one additional question on the post survey: Did your access to technology change during the semester and, if so, did it improve or worsen?
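The sampling design described above amounts to stratified sampling across campus, course level, and meeting time. A minimal sketch of that logic follows; the campus names, level labels, and section counts are invented placeholders, not my institution’s actual categories or procedure:

```python
import random

# Hypothetical strata: placeholders standing in for the real campuses,
# course levels (two BW, two FYC), and meeting times.
campuses = ["Campus A", "Campus B", "Campus C", "Campus D"]
levels = ["BW 1", "BW 2", "FYC 1", "FYC 2"]
times = ["morning", "afternoon", "evening"]

# Invent a roster with three sections in every (campus, level, time) cell.
sections = [
    {"campus": c, "level": lv, "time": t, "section": i}
    for c in campuses for lv in levels for t in times for i in range(3)
]

def stratified_sample(sections, per_stratum=1, seed=42):
    """Draw `per_stratum` sections from every (campus, level, time) cell,
    so each stratum is guaranteed representation in the survey."""
    rng = random.Random(seed)
    by_stratum = {}
    for s in sections:
        key = (s["campus"], s["level"], s["time"])
        by_stratum.setdefault(key, []).append(s)
    sample = []
    for cell in by_stratum.values():
        sample.extend(rng.sample(cell, per_stratum))
    return sample

sample = stratified_sample(sections)
```

The point of stratifying rather than drawing a simple random sample is that every campus-level-time combination is guaranteed to appear, which is what allows the later comparisons across campuses and course levels.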
My preliminary findings are many. The short version: students who do not have easy access to the internet off campus often find ways to secure better access during the semester when a course requires it or when they feel the need. A small percentage of students still have no internet access at all when not at school. The previous assumption was that students without easy internet access would grow too frustrated finding time to complete assignments on campus and would withdraw from their courses. Students still withdraw from all levels of writing classes, but access to technology does not seem to be a major factor in their persistence. The meeting time of the course had no significant impact on answers; which campus students attended did. The three campuses at roughly the same socioeconomic level ($60,000-$80,000 average annual income) all reported easier access to the internet and computers than our lowest socioeconomic campus, where results were dismal.
I also have data on students’ comfort levels with a variety of tasks, from remembering multiple usernames and passwords to typing their papers to creating and uploading video files. All of these can be cross-referenced by campus and course level.
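The cross-referencing described above is essentially a grouped tabulation: comfort scores for a given task, averaged within each campus (or course level). A minimal sketch, using entirely invented response data rather than my actual survey results:

```python
from collections import Counter

# Hypothetical responses as (campus, task, comfort score 1-5).
# These values are invented for illustration only.
responses = [
    ("Campus A", "typing papers", 5),
    ("Campus A", "uploading video", 2),
    ("Campus A", "typing papers", 4),
    ("Campus D", "typing papers", 3),
    ("Campus D", "typing papers", 2),
    ("Campus D", "uploading video", 1),
]

def mean_comfort_by_campus(responses, task):
    """Average self-reported comfort for one task, grouped by campus."""
    totals, counts = Counter(), Counter()
    for campus, t, score in responses:
        if t == task:
            totals[campus] += score
            counts[campus] += 1
    return {c: totals[c] / counts[c] for c in counts}

typing = mean_comfort_by_campus(responses, "typing papers")
```

The same grouping key could be swapped for course level, which is all that “cross-referencing” the comfort data requires.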
Essentially, I would like to do something with this information, perhaps running the surveys again in a year or two to see how the numbers change as students rely more heavily on their mobile devices. I would also love to run the survey at other local schools to see how the numbers compare between a public university with a strong commuter population and a private university with a larger percentage of residential students.
The dilemma I am having is this: will this type of research be accepted and welcomed in the BW or composition journals? Or is this type of research simply rare in our fields because we are not, by and large, numbers people?
DePew, Kevin. Skype interview. 2 Oct. 2015.
Hancock, Nicole. “SWIC Student Access to Technology and Comfort Using It in Class: Survey Results.” Keynote address, Outcomes Assessment Breakfast, Southwestern Illinois College, Belleville, IL, 19 Aug. 2015.