What are the challenges of doing DBR?
Although DBR is gaining ground in the field of education and is seen by many as a powerful research paradigm for improving educational settings, several challenges need to be carefully considered and monitored by anyone interested in using DBR as a research methodology. This section outlines several issues addressed by the Design-Based Research Collective [DBRC] (2003), including credibility of data, generalizability, and collaborative partnership. In addition, we extend the discussion to other practical challenges, such as sustainability, funding, publication, and IRB issues.
Objectivity, reliability, and validity are the traditional criteria for ensuring the credibility of research data, especially in scientific and experimental studies. Objectivity has two aspects: researchers remain neutral and "context free" so as not to perturb what is being measured, and they avoid subjective interpretations of data, because ideal research is designed so that the data speak for themselves. Reliability refers to the consistency of a set of measurements, while validity means that the measurements capture the qualities they are intended to measure.
Applying this traditional view of credibility to design-based research is a great challenge because design-based research does not strive for "context-free claims." Rather, it sees context as central to its conceptual terrain (Kelly, in press). Researchers conducting design-based research usually, if not always, need to immerse themselves in the research context and interact intensely with participants. As a consequence, it is difficult to remain objective and neutral; instead, the so-called "Hawthorne Effect" mentioned in Brown (1992) can arise, in which participants react to researchers' expectations. In addition, because design-based research uses both qualitative and quantitative methods, such as observation and interview, the ways of measuring and interpreting reliability and validity in traditional quantitative studies do not fit it well.
So how do we deal with this challenge? The DBRC (2003) suggested that to promote objectivity, design-based researchers need to shift regularly between the roles of advocate and critic to reduce bias and subjectivity. In particular, triangulated data collection methods should be used to document the processes of enactment. To enhance reliability, the DBRC (2003) again suggested triangulated data collection, plus repetition of analysis across cycles of enactment and the use (or creation) of standardized measures or instruments. As for validity, the DBRC (2003) stated that the validity of design-based research data lies in the alignment of theory, design, practice, and measurement, which can be strengthened over time through collaborative partnership and iterations of the design trajectory.
Generalizability refers to the extent to which one can extend the account of a particular situation or population to persons, times, or settings other than those directly studied (Maxwell, 2002). It matters because most researchers aspire to extend the application of their results beyond a specific research site in order to have a broader impact on society. Compared with purely quantitative studies, design-based research finds generalizability much more challenging because of its highly contextualized research agenda and its heavy reliance on thick description for data analysis.
Different perspectives explain the generalizability issues of design-based research. One holds that studies adopting qualitative methodology, as design-based research does, focus on uncovering themes and patterns at the specific research site rather than exploring the possibility of generalization. In other words, generalizability is sometimes simply set aside in favor of enriching the local understanding of a situation. Another perspective regards generalizability as something achieved by the reader of the research report: design-based researchers should provide data in sufficient detail for readers to fully grasp the meaning of the research and apply it to relevant situations themselves.
Gravemeijer and Cobb (in press) provide another perspective. They argue that the generalizability of design-based research can be achieved by framing exemplary or paradigm cases:

We touched on the issue of generalizability when discussing the importance of viewing classroom events as paradigm cases of more encompassing issues. It is this framing of classroom activities and events as exemplars or prototypes that gives rise to generalizability...the theoretical analysis developed when coming to understand one case is deemed to be relevant when interpreting other cases. Thus, what is generalized is a way of interpreting and understanding specific cases that preserves their individual characteristics. (p. 79)

The authors imply that the constructs, local theories, and design principles derived from iterations of design research form the basis of the generalizability of design-based research.
A typical design-based research project usually takes years to complete. Unexpected factors, such as participant wear-out or loss of motivation, make design-based research a challenging endeavor. In addition, design-based research draws on multi-disciplinary expertise, as Sandoval and Bell (2004) stated: "On the research side, design-based researchers draw from multiple disciplines, including developmental psychology, cognitive science, learning sciences, anthropology, and sociology. On the design side, researchers draw from the fields of computer science, curriculum theory, instructional design and teacher education" (p. 200). Thus, it is pivotal that design-based researchers maintain a good collaborative partnership with the various stakeholders as well as among the researchers themselves.
In order to maintain a collaborative partnership with the stakeholders, researchers have to scrutinize the social, cultural, psychological, and political dynamics in situ and identify the needs of all sides. Building on this in-depth understanding of the stakeholders in the context, design-based researchers can create and maintain conditions in which mutual benefit, understanding, trust, and support extend through the whole research process. For example, if teachers are given more autonomy and responsibility by serving as co-designers, they may feel more accountable for their design work and be less resistant to the changes inevitably required as the design-based research project evolves.
For the research team, Burkhardt (in press) stated that it is important to have stable design teams of adequate size to grapple with large tasks over the relatively long time scales required for both research and development. Following Burkhardt's advice, the team should gauge the design and research expertise required in each phase in order to recruit adequate team members and assign the division of labor appropriately. Mutual training and knowledge sharing also help team members leverage the overall knowledge base and make sense of each other's work. Finally, documenting research progress and major events within the research group helps members communicate with each other and reduces possible misunderstanding.
Another challenge of design-based research is sustainability. Brown (1992) observed that many new methods were eventually abandoned by teachers in the absence of continued support from the researchers. Therefore, it is very important that practitioners be self-motivated and self-sustaining so that they can maintain, or even improve, the designed innovations in the long run.
One way to achieve sustainability is to bring teachers in as co-designers or co-researchers. The benefits are: 1) teachers will be more committed to and responsible for using the designed innovations; 2) teachers will develop the knowledge and skills essential to withstand unexpected environmental changes without outside support; and 3) teachers will be more likely to appreciate the value of design-based research at the methodological or philosophical level, instead of merely adopting the designed activities. Another way to facilitate sustainability is to establish learning communities of teachers. In such communities, teachers can exchange ideas and learn from each other's experience; in particular, peer teaching and mentoring can be powerful means for teachers to quickly learn new instructional strategies and products, as well as create a sense of belonging that motivates teachers to keep using the designed innovations.
From the organizational or institutional point of view, Billig, Sherry, and Havelock (2005) provide a framework for sustainable innovations comprising nine essential elements. Here we simply list the nine elements without going into detail: 1) strong leadership; 2) strong infrastructure; 3) support structures; 4) incentives to draw people into the system and keep them there; 5) visibility of the project; 6) credibility for success; 7) strong and mutually beneficial partnerships; 8) macroculture development to support identity and contextual relevance; and 9) sufficient funds from multiple sources.
Given that design-based research is viewed by many as a new and different educational research paradigm, with its longitudinal and iterative nature, its dual purpose of theoretical and practical improvement, and its mixed data collection methods, it is very challenging to get design-based research projects funded or published. To mitigate this problem, design-based researchers need to showcase the various initiatives being undertaken using the design-based research paradigm.
For example, it is helpful to present exemplary, prototypical studies reflecting classic applications of design-based research. These provide a "roadmap" for potential stakeholders or advocates. Some good exemplars can be seen in the earlier section on this page. Moreover, standards building is necessary for communicating with the broader research community, including journal review boards, academic institutions, funding sources, and the general public. As Collins, Joseph, and Bielaczyc (2004) stated: "if design research is to become accepted as a serious scholarly endeavor, the learning-sciences community needs to take responsibility for creating standards that make design experiments recognizable and accessible to other researchers" (p. 16). This is an immediate goal that design-based researchers should take up.
The iterative and longitudinal nature of DBR makes it difficult to present a clear-cut, start-to-end picture to IRB reviewers. In addition, DBR requires flexibility, which means that researchers may modify or change interventions, and even the composition of participants, during the study. This makes it harder to justify the research process and the potential benefits of the study to IRB reviewers. Finally, the intensive interactions between researchers and participants, plus the substantial interventions in the context, also increase the difficulty of obtaining IRB approval.
Although the IRB process varies across academic institutions, it is generally helpful to split a large project into multiple IRB submissions. Design-based researchers may first submit the central ideas, along with anticipated variations of the project, to obtain IRB approval for the first trajectory. They may then submit amendment forms to get the project re-approved at later stages.
References
Billig, S. H., Sherry, L., & Havelock, B. (2005). Challenge 98: Sustaining the work of a regional technology integration initiative. British Journal of Educational Technology, 36(6), 987–1003.
Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2(2), 141-178.
Burkhardt, H. (in press). From design research to large-scale impact: Engineering research in education. In van den Akker, J., Gravemeijer, K., McKenney, S. & Nieveen, N. (Eds.), Educational design research (pp. 185-228). London: Routledge.
Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: theoretical and methodological issues. Journal of the Learning Sciences, 13(1), 15-42.
Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5-8.
Gravemeijer, K., & Cobb, P. (in press). Outline of a method for design research in mathematics education. In van den Akker, J., Gravemeijer, K., McKenney, S. & Nieveen, N. (Eds.), Educational design research (pp. 45-85). London: Routledge.
Hammersley, M. (1992). What’s wrong with ethnography? London: Routledge.
Kelly, A. (in press). Quality criteria for design research: Evidence and comments. In van den Akker, J., Gravemeijer, K., McKenney, S. & Nieveen, N. (Eds.), Educational design research (pp. 166-184). London: Routledge.
Maxwell, J.A. (2002). Understanding and validity in qualitative research. In Huberman, A. M., & Miles, M.B. (Eds.), The qualitative researcher’s companion (pp. 37-64). Thousand Oaks: Sage.
Sandoval, W.A. & Bell, P. (2004). Design-based research methods for studying learning in context: introduction. Educational Psychologist, 39(4), 199-201.
© Peer Group 2006