How can you carry out discourse analysis using corpus linguistics? What research questions should you ask? Which methods should you use, and when? What is a collocational network or a key cluster?
Introducing the major techniques, methods and tools for corpus-assisted analysis of discourse, this book answers these questions and more, showing readers how best to use corpora in their analyses. Each chapter is devoted to a central technique, including frequency, concordancing and keywords, and uses carefully tailored case studies to work step by step through the process of applying different analytical procedures. Drawing on a wide range of corpora, from holiday brochures to political debates, the book considers the key debates and latest advances in the field.
Fully revised and updated, this new edition includes:
- A new chapter on how to conduct research projects in corpus-based discourse analysis
- Completely rewritten chapters on collocation and advanced techniques, using a corpus of jihadist propaganda texts and covering topics such as social media and visual analysis
- Coverage of major tools, including CQPweb, AntConc, Sketch Engine and #LancsBox
- Discussion of newer techniques, including the derivation of lockwords and the comparison of multiple data sets for diachronic analysis
With exercises, discussion questions and suggested further readings in each chapter, this book is an excellent guide to using corpus linguistics techniques to carry out discourse analysis.
Aside from experiments with Mark Davies' COCA interface and WordSmith Tools, this book was my first exposure to corpus analysis.
[As an aside: corpus analysis is a research methodology originating in linguistics. Today the methodology is associated primarily with large databases and specialized software packages that facilitate the examination of large bodies of text in order to uncover language patterns above the level of the sentence. An example of one such corpus is the Corpus of Contemporary American English (COCA), which is continuously updated and at present contains about 425 million words. Corpus analysis uses empirical methods (as opposed to armchair introspection) to develop theories of how natural language is used in specific contexts.]
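To make that aside concrete, here is a minimal sketch, not taken from the book and far simpler than tools like WordSmith, AntConc or CQPweb, of the two most basic operations such software performs: a word-frequency count and a keyword-in-context (KWIC) concordance. The sample text and the node word "refugees" are invented for illustration.

```python
# Minimal sketch of two basic corpus operations: a frequency list and a
# keyword-in-context (KWIC) concordance. Real tools run these queries
# over millions of words with far richer options.
import re
from collections import Counter

corpus = (
    "The refugees arrived at the border. Officials said the refugees "
    "would be processed quickly, but many refugees waited for days."
)  # illustrative stand-in for a real corpus file

tokens = re.findall(r"[a-z']+", corpus.lower())

# 1. Frequency list: how often does each word form occur?
freq = Counter(tokens)
print(freq.most_common(5))

# 2. KWIC concordance: show each hit of a node word with its co-text.
def concordance(tokens, node, window=4):
    for i, tok in enumerate(tokens):
        if tok == node:
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            print(f"{left:>35} [{node}] {right}")

concordance(tokens, "refugees")
```

The discourse-analytic work starts only after this point, in reading the concordance lines for recurring patterns.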
I think this book works quite well for those who have little experience with the field. As the title suggests, it's also geared more towards the researcher who wants to add some quantitative methods to the usual qualitative tools deployed by a discourse analyst (and probably mostly a critical discourse analyst: Fairclough is cited with fair regularity).
The text's failings are perhaps by design, since it is meant to be only an introduction and a how-to guide. However, for my tastes I would have preferred more theoretical framing of these methods (particularly in terms of epistemology, semiotics, or social theory). This would have helped me to wrap my mind around the kind of machine-aided pattern-matching typical of corpus analysis. (In general, it seems as if the cultural, philosophical and historical significance of the corpus as a method isn't touched on in this book.) At any rate, a brief survey of the literature on topics such as these would have been useful.
After reading this introductory textbook on innovative corpus-based discourse studies, I have concluded that modern discourse analysis remains very subjective, even if it is based on the objective means of large text corpora and quantitatively sound statistical analyses.
As the author himself states, not just once but several times in the final part of almost every chapter: "Finally it should be borne in mind that a concordance-informed discourse analysis is still a matter of interpretation. The patterns of language which are found (or overlooked) may be subject to the researcher's own ideological stance. And the way they are interpreted may also be filtered through the researcher's subject position. [...] [Additionally] an analyst's identification of a discourse may not mean that the same discourse is viewed in the same way, if at all, by other readers." (p. 92, emphasis mine)
This short excerpt seems to me to be the most extreme disclaimer one could make about one's own scientific field!
A fine introduction to corpora and how they work. There are some good case studies of how corpora can be used in discourse analysis. However, my expectations of what the book would be were a bit different: I was hoping for something a bit meatier. I was also expecting more of a focus on critical discourse analysis rather than plain old discourse analysis (Baker himself notes the difference that some see between the two, though I wasn't aware of such a difference myself); the attention that CDA pays to ideology and power structures in society is more of what I'm after.
I stumbled upon this book by accident and picked it up since I thought it could help me out with the third dimension of Wodak's discourse-historical approach, and I'm so glad I did. This book is such a nice introduction to corpus linguistics with its many hands-on examples and step-by-step guides, and I now feel more confident using it as a tool in my thesis.
This book is an excellent introduction to discourse analysis using corpus methodologies. It is organized around how a corpus can be analyzed, focusing mainly on techniques like frequency, collocation, keywords and concordances. Each of these techniques is illustrated with an applied research topic, which makes the book a guide grounded in practice.
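As a rough illustration of one of those techniques, and not drawn from the book itself, the sketch below finds collocates of a node word inside a +/-3-token window and ranks them with a simple mutual information (MI) score of the kind corpus tools report. The toy sentence, the node word "immigrants" and the window size are all assumptions made for the example.

```python
# Rough sketch of collocation analysis: find words co-occurring with a
# node word inside a +/-3 token window and rank them by a simple
# mutual information (MI) score. Toy text; real corpora are far larger.
import math
import re
from collections import Counter

text = (
    "Illegal immigrants were detained at the port while genuine "
    "immigrants and asylum seekers were allowed through, officials "
    "said, although many immigrants had no papers."
)  # invented example text

tokens = re.findall(r"[a-z]+", text.lower())
N = len(tokens)
freq = Counter(tokens)

node, span = "immigrants", 3

# Count how often each word appears within the window around the node.
co_occurrence = Counter()
for i, tok in enumerate(tokens):
    if tok == node:
        window = tokens[max(0, i - span):i] + tokens[i + 1:i + 1 + span]
        co_occurrence.update(window)

# Basic MI (Church & Hanks), without the window-size correction
# that some corpus tools apply.
def mutual_information(collocate):
    observed = co_occurrence[collocate]
    return math.log2(observed * N / (freq[node] * freq[collocate]))

for collocate in co_occurrence:
    print(f"{collocate:>12}  MI = {mutual_information(collocate):.2f}")
```

In practice the interesting step is again interpretive: asking why certain collocates cluster around a node word and what discourses that patterning points to.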
I read this book for my Corpus Analysis class. It was the better of the two texts we had for this class. It is a good introduction to the tools of corpus based discourse analysis, and the example analyses were demonstrative and overall helpful. Not a book I would recommend unless you are taking a class on the subject, but good for what it is.
A good book that shows the different techniques through which corpora can be used to investigate issues such as frequency and dispersion, collocates, keyness, normalization, modality, metaphor, etc.
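For readers new to two of the measures named here, the following is a minimal sketch, not from the book, of normalization to frequencies per million words and of keyness scored with the log-likelihood statistic commonly used in corpus linguistics. The word counts and corpus sizes are invented for illustration.

```python
# Minimal sketch: normalized frequencies (per million words) and keyness
# via the log-likelihood statistic often used in corpus linguistics.
# The word counts and corpus sizes below are invented for illustration.
import math

def per_million(freq, corpus_size):
    """Normalize a raw frequency to occurrences per million words."""
    return freq * 1_000_000 / corpus_size

def log_likelihood(a, b, c, d):
    """Keyness of a word occurring a times in a study corpus of c words
    versus b times in a reference corpus of d words."""
    e1 = c * (a + b) / (c + d)   # expected frequency in the study corpus
    e2 = d * (a + b) / (c + d)   # expected frequency in the reference corpus
    ll = 0.0
    if a > 0:
        ll += a * math.log(a / e1)
    if b > 0:
        ll += b * math.log(b / e2)
    return 2 * ll

# Toy example: "flood" appears 120 times in a 1M-word news corpus
# and 45 times in a 2M-word reference corpus.
a, c = 120, 1_000_000
b, d = 45, 2_000_000
print(f"per million (study corpus): {per_million(a, c):.1f}")
print(f"per million (reference):    {per_million(b, d):.1f}")
print(f"log-likelihood keyness:     {log_likelihood(a, b, c, d):.2f}")
```

Normalization makes corpora of different sizes comparable, and the keyness score flags words that are unusually frequent in one corpus relative to the other, which is typically the starting point for closer concordance work.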