Is a Rose by any Other Name, Still a Rose? Why Knowledge Translation and Implementation Science are not Synonymous

The language we use for concepts is important because common understanding shapes our world. By avoiding confusion in nomenclature, we better understand the world around us, and as researchers or practitioners, we deepen our knowledge base and move it forward. Shared language helps us communicate more effectively and is critical to collaboration and co-creation, which in turn are critical to science (Thomas & McDonagh, 2013).

I have been teaching knowledge translation (KT) and implementation science for almost 15 years. In that time, I have arrived at a comfortable clarity about the meaning of the term knowledge translation. Here it is.

Knowledge Translation Defined

Simply put, knowledge translation is the practice of communicating research evidence using processes and strategies that ensure the evidence can be accessed and understood in a manner that can benefit a range of knowledge users, both within and beyond academia, as appropriate. Knowledge translation is best conceptualized as an umbrella term that encompasses several subspecialties that are not exclusive to health research, including: dissemination; practice, behaviour, or policy change; knowledge management; and commercialization and technology transfer. The term translation(al) science is used more rarely today but refers to the translation of basic science to clinical application; one part of the translation continuum.

When we seek to change practice, behaviour, or policy, we enter the subspecialty of implementation science, defined as “the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice, and, hence, to improve the quality and effectiveness of health services and care” (Eccles & Mittman, 2006). Note the emphasis on the purpose or KT goal: “to promote the systematic uptake”. This presumes the research evidence we are sharing has instrumental use and is ethically ready for application and scale up. Implementation science does not capture the entire spectrum of knowledge translation goals and activities; hence, the terms are related but not synonymous.

Until very recently, the empirical health literature used the term “KT strategy” in specific reference to strategies directed at practice change, giving rise to the confounding of the two terms (e.g., Scott et al., 2012). The same is evident in models of practice change (e.g., Graham et al., 2006). The possibility that we might engage in knowledge translation for purposes other than practice change was lost in translation. The more recent upsurge of implementation science literature has created greater clarity, however, with references to implementation strategies (see Powell et al., 2012; Powell et al., 2015) and implementation models and frameworks (see Nilsen, 2015).

Whether you prefer the term knowledge translation or knowledge mobilization does not matter, as both terms encompass the processes and strategies involved in achieving KT goals. Using the term knowledge translation synonymously with implementation science implies that practice, behaviour, or policy change is the only goal for sharing research evidence. This is simply not the case, if only because a great deal of KT activity in practice and science is firmly rooted in building awareness and informing, and very little research is ready for instrumental use (Amara, Ouimet, & Landry, 2004). Instrumental use involves applying research results in specific, direct ways. Conceptual use involves using research results for general enlightenment; results influence actions, but more indirectly and less specifically than in instrumental use. Symbolic use involves using research results to legitimate and sustain predetermined positions (Beyer, 1997).

Components of an Effective KT Approach

Effective knowledge translation begins with understanding the evidence and the meaning and benefit that others may derive from it (main message; what is being communicated). Effective knowledge translation also requires attention to the language used to communicate the evidence (plain language), knowledge user preferences regarding the format in which it is shared (e.g., oral, written, visual; how the message is communicated), and the channel through which it is delivered (e.g., radio, journal, social media, in person, newsletter; how the message physically gets transferred). Lastly, we must consider the strategy(ies) needed to achieve the KT goal (e.g., webinar, reminder, patent, opinion leader), and evaluate whether the KT goal was achieved (indicator or metric).

Begin with a KT Goal

The strategies we use to convey the ‘main messages’ of research findings are related to the purpose of the communication. Like any communication, knowledge translation always has at least one purpose or knowledge translation goal. Our everyday communications have a purpose: to inform, declare, emote, inspire, instruct, etc. When we share knowledge emerging from research (evidence-informed knowledge), we share it with a purpose that is tied to context.

The context surrounding the knowledge translation communication involves what we reasonably know from the evidence, being mindful of its potential use (conceptual, symbolic, instrumental), that it is shared ethically, and with awareness of how the knowledge user may access, understand, and benefit from it.

Knowledge translation goals include: building awareness and interest; informing research to build the scientific knowledge base; informing decision-making; facilitating practice, behaviour or policy change; and/or facilitating commercialization or technology transfer. When we share research evidence, it is with at least one of these goals in mind.
To be successful in achieving practice, behaviour, or policy change first requires engaging in adequate sharing of evidence to inform, build awareness, shift attitudes (create buy in), and inform decision-making. Not all evidence shared for building awareness or informing is ready for application, but this does not make it less beneficial to the knowledge user.

Knowledge translation exists on a continuum; each KT goal within the continuum is tied to the strength and potential use and/or benefit of the evidence to be shared. It is entirely appropriate to share what has been learned from a single study, as we do when we publish in a peer-reviewed journal or present at a conference. Doing so ethically requires us to be mindful of the limits of what we know, how it may be interpreted, and how it may benefit knowledge users. However, when we intend for research evidence to drive practice, behaviour, or policy change, we must hold the evidence to a higher standard for strength, quality and rigour. Engaging with the purpose of changing how someone behaves requires strong evidence that has been replicated in high quality studies; systematic reviews help us to identify this evidence.

As in any scientific field, there is an evolution of understanding and advancement of concepts. The field of knowledge translation has evolved considerably, and continues to do so. Perhaps some clarity on terminology will help us all to move in unison, across the sectors and disciplines that seek to extend the impact of new knowledge.
Note: This understanding of knowledge translation terminology is available in multiple formats, including the Knowledge Translation Planning Template (Barwick, 2008, 2013) and two e-learning modules (see references).


Barwick, M. (2008, 2013) Knowledge Translation Planning Template.  Toronto ON: The Hospital for Sick Children. Available from:

Barwick, M. (2016). Building Scientist Capacity in Knowledge Translation: Development of the Knowledge Translation Planning Template. Technology Innovation Management Review, 6(9): 9-15

Barwick, M., Filipovic, S., McMillen, K., Metler, S., & Warmington, K. (2017) Working with the KT Planning Template. E-learning module.

Barwick, M., Filipovic, S., McMillen, K., Metler, S., & Warmington, K. (2017) Introduction to knowledge translation. E-learning module.

Eccles, M. P., and Mittman, B. S. (2006). Welcome to implementation science. Implementation Science, 1(1). doi: 10.1186/1748-5908-1-1.

Graham I, Logan J, Harrison M, Straus S, Tetroe J, Caswell W, Robinson N. (2006). Lost in knowledge translation: time for a map? Journal of Continuing Education in the Health Professions, 26, 13-24. 10.1002/chp.47.

Nilsen, P. (2015). Making sense of implementation theories, models, and frameworks. Implementation Science, Apr 21;10:53. doi: 10.1186/s13012-015-0242-0.

Powell, B. J., McMillen, J. C., Proctor, E. K., Carpenter, C. R., Griffey, R. T., Bunger, A. C., … York, J. L. (2012). A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review, 69(2), 123-157. DOI: 10.1177/1077558711430690

Powell, B. J., Waltz, T. J., Chinman, M.J., Damschroder, L. J., Smith, J. L., Matthieu, M. M., Proctor, E.K., Kirchner, J.E. (2015). A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science, Feb 12;10:21. doi: 10.1186/s13012-015-0209-1.

Scott, S.D., Albrecht, L., O’Leary, K., Ball, G.D., Hartling, L., Hofmeyer, A., Jones, C.A., Klassen, T.P., Kovacs Burns, K., Newton, A.S., Thompson, D., Dryden, D.M. (2012). Systematic review of knowledge translation strategies in the allied health professions. Implementation Science, Jul 25;7:70. doi: 10.1186/1748-5908-7-70.

Thomas, J., and McDonagh, D. (2013). Shared language: Towards more effective communication. Australasian Medical Journal, 6(1):46–54.

[1] “A rose by any other name would smell as sweet” is a popular reference to William Shakespeare’s play Romeo and Juliet, in which Juliet seems to argue that it does not matter that Romeo is from her family’s rival house of Montague, that is, that he is named “Montague”.

NOTE: This blog post appears on the KNAER Blog as an invited submission.

Fundamental Considerations for the Implementation of Evidence in Practice

Implementation science is the scientific study of methods that support the adoption of evidence-based interventions into a particular setting (e.g., health, mental health, community, education, global development). Implementation methods take the form of strategies and processes that are designed to facilitate the uptake, use, and ultimately the sustainability – or what I like to call the ‘evolvability’ – of empirically supported interventions, services, and policies in a practice setting (Palinkas & Soydan, 2012; Proctor et al., 2009); referred to herein as evidence-based practices (EBPs).

The National Implementation Research Network (NIRN) refers to implementation as a specified set of activities designed to put into practice an activity or program of known dimensions. NIRN makes the point that implementation processes are purposeful and are described in sufficient detail that independent observers can detect the presence and strength of the “specific set of activities” related to implementation. Likewise, the activity or program being implemented should be described in sufficient detail that its presence and strength can be detected and its implementation replicated in similar contexts.

Implementation is commonly considered one of the last stages of the intervention research process that follows the results of effectiveness studies. It sets out to explore the role of context, how best to prepare for evidence adoption and scale up, the practical considerations of implementation, and how best to facilitate sustainability.

Implementation focuses on taking interventions that have been found to be effective using methodologically rigorous designs (e.g., randomized controlled trials, quasi-experimental designs, hybrid designs) under real-world conditions, and integrating them into practice settings (not only in the health sector) using deliberate strategies and processes (Powell et al., 2012; Proctor et al., 2009; Cabassa, 2016). Hybrid designs have emerged relatively recently to help us explore implementation effectiveness alongside intervention effectiveness to different degrees (Curran et al., 2012).

Cumulative experience and evidence in implementation science suggests several fundamental considerations for success. What follows are my own musings about what these fundamental considerations are, and these will likely evolve as I progress in my own research and follow the work of my implementation colleagues worldwide.

Fundamental Considerations for Evidence Implementation

  1. The implementation of empirically supported interventions or practice innovations is a dynamic social process. One does not manage this in isolation from colleagues, partners, stakeholders, leaders, or champions.
  2. Implementation is shaped by the context in which the practice innovation takes place, by the people involved in this process (those who provide the new intervention and those who receive it), the characteristics of the intervention, and the characteristics of the inner and outer systems (see Damschroder et al., 2009).
  3. Implementation unfolds over time through stages, requiring transformation of the practice context and, often, some degree of adaptation of the innovation (Cabassa & Baumann, 2013). Transformation of the setting (readiness or preparation) takes time; the organizational conditions for practice change success must be built before training takes place, and when this step is disregarded (as it often is), change is unsuccessful. System change has implications for leadership expectations in rolling out and scaling up EBPs across a system. Organizations and individuals reach absorptive capacity for implementation, and so timelines and system expectations need to be paced accordingly.
  4. One must consider the costs of implementation and allow organizations and systems to budget for the changes that will take place, often over several years, during and beyond the initial implementation. The context in which implementation occurs has important implications for cost and sustainability. Consider who is driving the change and paying for it: a research grant, an organization, a government, or a purveyor.
  5. Adaptation of EBPs requires knowledge of the active ingredients of the intervention and fidelity (competence and adherence) to these elements.
  6. Implementation often requires co-creation (see Metz 2015), involving the interaction, collaboration, and participation of stakeholders or knowledge users at multiple levels of an organization and system of care (Aarons, Horowitz, et al., 2012). Implementation teams require the engagement of organizational leaders, directors, managers, administrators, service providers, frontline staff, clients, and their family members, as implementation entails a multitude of social processes, including planning, decision making, negotiating, prioritizing, problem solving, service delivery, restructuring, and the allocation of resources (Cabassa 2016). Implementation complexity calls for greater social interaction and involvement of stakeholders who can facilitate the process and can bring knowledge and expertise about the intervention, and locally grounded knowledge, skills, and understanding about the settings and communities in which the intervention will be used. Implementation science is a collaborative endeavor (Cabassa 2016).
  7. Implementation is inherently about change management and involves a new way of doing things within an organization or system. It overlaps with but differs in important ways from quality improvement (see Bauer et al., 2015). Implementation involves a change in the status quo that requires adaptations and adjustments in attitudes, social norms, practices, procedures, work flow, behaviors, and policies. The change process is guided by a combination of project management and implementation strategies and processes (Powell et al., 2012).
  8. Implementation of evidence-based care/practices fundamentally involves two content areas: the evidence related to the innovation and the evidence on implementation science. Relatedly, this means that evaluation of implementation efforts and indicator tracking fundamentally requires evaluation of the innovation target (i.e., health outcome) and implementation outcomes (see Proctor et al., 2012).
  9. Implementation science is an emerging and dynamic field. Current best evidence about IS can be instructive but it is a moving target, and we are still learning a range of things in this field, including:
    1. Which constructs and factors are important to consider for implementation success
    2. How to measure these key factors
    3. Which processes work best in which contexts and for which types of evidence
    4. Which research methods are best for IS, and how to train health scientists in them
    5. How best to report IS research so we can learn what works (see the StaRI Guidelines; Pinnock et al., in press, BMJ Open; two papers)
    6. How to track and measure implementation costs
    7. What implementation success looks like, including expectations of sustainability (and what I call “evolvability”)
    8. How best to measure implementation outcomes
    9. How to develop resources and tools to support implementation, used independently or with technical assistance
  10. There is growing global expertise in implementation practice and science, as manifest in the Global Implementation Initiative, the Global Implementation Conference, the Society for Implementation Research Collaboration, and jurisdictional implementation communities.
  11. We need to build capacity for implementation within systems and within organizations, and this includes building organizational competencies and a workforce of implementation specialists (requiring the development of requisite training programs and curricula). Some thinking is occurring around this issue, such as the Global Implementation Society hosted by the Global Implementation Initiative.
  12. Implementation of evidence involves embedded implementations – at the clinical, organizational, and systems levels. This means implementing change at all levels (i.e., new ways of doing business, changing processes, building transformative leadership and organizational conditions for change), not only implementing the evidence in practice.
  13. Implementation is not a ‘make it so’ proposition, but rather is complex and time consuming (varying time and complexity for different situations), and it is in our best interest to recognize these challenges whilst trying to fulfill political and real world expectations.
  14. Implementation is not a one size fits all enterprise. We should not expect to arrive at a menu of implementation theories/frameworks, approaches and/or strategies that can be mapped to sector, context, health issue, or population. It will always be an iterative process.
  15. There are likely some factors that are universally important for implementation success across contexts (i.e., health, mental health, community, global health, education), and some that are unique to a particular setting. Research is working to parse this in a way that can help us to better plan for and engineer successful change (Barwick, 2016; Barwick et al., submitted).
  16. Implementation planning, and ultimately its success, will depend in part on the implementation context or the mechanisms driving the implementation. Whether the implementation initiative is driven by the motivation of the organization, government, funder, EBP purveyor, or research study likely matters for planning, costs, and most especially for sustainability (Barwick, 2016; Barwick et al., submitted).

As mentioned above, this list of implementation fundamentals is ‘evergreen’ and will evolve over time. Please feel free to contribute a comment: what is missing, and do the considerations identified here have face validity?


Barwick M. (2016). The Consolidated Framework for Implementation Research: Comparison across contexts. Paper presented at the 3rd Biennial Australasian Implementation Conference, Melbourne, Australia, October 6th 2016.

Barwick M, Kimber M, Akrong L, Johnson S, Cunningham CE, Bennett K, Ashbourne G, Godden T. (Submitted Nov 3 2016, PLOS ONE). Evaluating Evidence-Informed Implementation: A Multi-Case Study of Motivational Interviewing in Child and Youth Mental Health.

Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. (2015). An introduction to implementation science for the non-specialist. BMC Psychology 2015,3:32.

Brownson RC, Colditz G, Proctor E. (Eds.)  (2012). Dissemination and implementation research in health: Translating science to practice. Oxford: Oxford University Press.

Cabassa, LJ. (2016). Journal of Social Work Education, 52(S1), S38–S50.

Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C (2012). Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care, 50: 217-226. 10.1097/MLR.0b013e3182408812.

Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, et al. (2009) Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 4: 50. DOI: 10.1186/1748-5908-4-50.

Metz A. (2015). Implementation Brief: The potential of co-creation in implementation science. Chapel Hill NC: NIRN.  Accessed January 12th 2017 from

Palinkas LA & Soydan H. (2012). New horizons of translational research and research translation in social work. Research on Social Work Practice, 22, 85-92.

Pinnock H, Barwick M, Carpenter C, Eldridge S, Grandes G, Griffiths C, Rycroft-Malone J, Meissner P, Murray E, Patel A, Sheikh A, Taylor S. (In press, January 11 2017). Standards for Reporting Implementation Studies (StaRI) Explanation and Elaboration Document. BMJ Open

Pinnock H, Barwick M, Carpenter C, Eldridge S, Grandes G, Griffiths C, Rycroft-Malone J, Meissner P, Murray E, Patel A, Sheikh A, Taylor S (In press, Oct 25 2016). Standards for reporting implementation studies (StaRI) Statement. British Medical Journal.

Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, York JL. (2012). A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review, 69, 123-157.

Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C & Mittman B. (2012). Implementing research in mental health services: An emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Research, 36, 24-34.



Knowledge Translation and Strategic Communications: Unpacking Differences and Similarities for Scholarly and Research Communications

We’ve just published an article looking at the differences and similarities between KT and communications! This is a hot topic that has come up often at our Scientist KT Trainings and in conversations with KTPs and researchers. The boundaries between the two professions are somewhat strained, and so we hope this analysis will continue the conversation toward a more flexible state of affairs in organizations that employ both KTPs and communications professionals. We hope you enjoy the piece! Please feel free to comment and share your point of view.

Knowledge Translation and Strategic Communications: Unpacking Differences and Similarities for Scholarly and Research Communications
Melanie Barwick, Hospital for Sick Children
David Phipps & Michael Johnny, York University
Gary Myers,  KMbeing
Rossana Coriandoli,  EENet, Centre for Addiction and Mental Health

Citation: Barwick, Melanie, Phipps, David, Myers, Gary, Johnny, Michael, Coriandoli, Rossana. (2014).  Knowledge Translation and Strategic Communications: Unpacking Differences and Similarities for Scholarly and Research Communications. Scholarly and Research Communication, 5(3): 0305175, 14 pp.

‘Single Most Important Things’ from Australia


It’s mid-October, and I’m home after five weeks in Australia, touring the country, giving talks and workshops, and meeting people. Australia is a vast, beautiful, and wondrous country that invites countless more hours of exploration. I am committed to returning: to explore the land, to learn more from its peoples, and to revisit the people whom I had the privilege of meeting – to learn what they were able to develop from our conversations and from the stories and evidence I shared with them. As in any good knowledge translation effort, the bottom line is impact – demonstrating that people knew what to do with the knowledge you shared.

I spent most of my time in Perth, a lovely city perched on the Swan River in Western Australia. I visited Sydney in the eastern state of New South Wales, Broome in the northern Kimberley region, and Albany on the south coast. Each place has its own beauty, treasures, and cultures.

I spoke about my own research and my implementation journey over the last 14 years. I shared new thinking about the role of KT in academic promotion, about what counts as “scholarly” in 2014, and how many North American universities have re-envisioned the modern day scholar and the role of the university in today’s society. I met and spoke with senior university leaders, senior managers in health and mental health care, and frontline service providers in health and mental health in urban and rural areas. At every occasion, audiences seemed poised to support systemic changes that would facilitate a new era of KT activities, funding and policy shifts, and scholarly recognition for this work across Australia. Canada is a great model for such a paradigm shift, having worked and developed funding, policy, research and practice initiatives to support KT science and practice for the better part of a decade or more. Here’s a summary of my impressions and what I hope to have left behind.

Invest in system-wide outcome measurement, something that has been accomplished in many countries and jurisdictions. Without this, there is no basis for tracking change and thus no way of systematically assessing whether changes to evidence-based practices are having the desired effects system wide. This is highly feasible. One need only look to models of success in Ontario, Michigan, New York, etc., plan for the change, and implement it. Data systems are the backbone of service improvement, and it is a huge oversight to try to improve systems through policy change or by implementing evidence-based care without first building the infrastructure to support these efforts. Once analyzed, system-level data need to be shared back with the organizations that contributed the data in the first place; we need to close the loop. This encourages compliance, reflection, and utilization of data in service delivery. In the end, it’s about getting better outcomes, not about defending a program you like or providing treatment you believe works without going the extra step to assess your fidelity and outcomes.

Use data. Many organizations, including health service organizations and government departments, understand they need data. And, for the most part, many of them collect data about service or system outcomes. However, many don’t appear to use the data they collect in a meaningful way. Using data to improve services needs to be a ‘deliberative process’. If we are to achieve optimal utilization of the data we go to great lengths to collect, we need to be deliberate in our approach, and this requires us to change how we use our time, which means we must stop complaining that we don’t have the time. How we allocate our time in health services is a feasible change. In the same way that implementation of evidence-based practices requires time for planning and reflection, so too does the deliberate use of administrative data. On the policy side, government access to evidence is not the only issue; governments must review and reflect on the data they receive, and share back what they learn. The data won’t tell a story on their own; they need to be reviewed and reflected upon by those overseeing and providing services. At present, much administrative data sits gathering dust, like many of our reports, and represents unrealized potential for change. In the words of Atul Gawande: count something, write it down, (reflect upon it), and tell someone about it. Look for opportunities for change.

Let go of the ‘cherished notions’. Service providers need to drop the services that have no evidence to support them; their cherished notions. They can only know whether their services are effective if they examine their data and, equally, if they collect their outcome data on a routine basis to begin with. It is time to heighten our accountability to the children and families we serve, and to be honest about the effectiveness of the services we provide. For their part, recipients of mental health and health services need to ask their providers for evidence that their services work. We do no less when we go to hospital for treatment, or when we purchase other services. Somehow, this question is not being asked by those seeking services for their kids’ mental health. We need to empower them to do so.

Fund partnerships and change academic currency. Implementing evidence in practice requires partnerships between academia, government, and service delivery. For these partnerships to occur, they have to be recognized as relevant by health funders and as scholarly, worthwhile pursuits for academics, and in many of Australia’s universities, they are not. Many universities, particularly in North America, have adopted new formats and philosophies related to academic promotion and have incorporated a new vision of scholarship that recognizes community-engaged scholarship. Australia needs to turn to these models and start the conversation for change.

Change how and what funders fund. A related and important driver for change is the role of the health research funder. In Canada, the Canadian Institutes of Health Research has been an important driver of growth in KT and implementation science by way of tailored RFPs for KT science that specifically request joint research leadership between nominated principal applicants and nominated principal knowledge users, and by allowing for KT and implementation activities in research budgets (not only for travel and open access publications). Other provincial funders, including Alberta Innovates Health Solutions, the Michael Smith Foundation for Health Research, and the Nova Scotia Health Research Foundation, have followed suit to change their own funding mechanisms and processes and to offer capacity building in KT, in partnership with SickKids and other collaborators. Australasian funding bodies can better support partnership and KT, including enabling this work to be resourced through their grants, beyond the discovery stage. This means appreciating the multiple forms of partnership, engagement, and KT strategies that better exemplify the scope of activities designed to create real-world impacts from the science they fund. Simply counting productivity outputs (grant funding, publications) neither encourages nor captures the impact stories, nor demonstrates the real-world impacts that have resulted from funding. These changes are low-hanging fruit; they are entirely feasible and need to be acted upon if Australia is to realize real change in practice stemming from effective KT and partnerships.

The impacts from my Australian journey are blossoming and will reveal themselves in time. Every new connection, every conversation, provided an opportunity for knowledge exchange, for uncovering similarities and differences, and the potential for further collaboration, outcomes and impact. Each meeting became part of my implementation journey, and via this account, part of the collective narrative of our global implementation journey.   There is some positive movement in these important areas.

Build organizational and scientist capacity for KT. Building KT capacity is essential if Australia is to move towards real world impacts for the science it funds and the services it provides. I ran a well-attended Scientist Knowledge Translation Training workshop for about 30 people across a range of health disciplines, sponsored by the Institute of Advanced Studies and the School of Public Health at UWA – a big thank you to Susan Kateo at IAS and Dr. Lisa Wood at UWA. Senior university leaders from several universities heard me speak on new paradigms for academic promotion. They were very receptive to the ideas and models shared, and several are motivated to lead the charge. There is a commitment, expertise (KT Australia), and mechanisms in place to host professional development workshops in KT. A partnership between SickKids and KT Australia to provide the Scientist Knowledge Translation Training™ across Australia is in the making, with a view to realizing our first workshop in Fall 2015.

Change how and what funders fund. The funders with whom I met (i.e., Healthway, Cancer Australia, Department of Health) shared their thoughts on what they might do differently and demonstrated a readiness to explore how they might shift their funding processes and competition streams to encourage partnership, KT activity, and a focus on implementation and real world impacts. They view themselves as ‘small’ funders, but are open to the idea that they can, in their own right and through collaborative support, start a new movement. I remain hopeful about their vision and commitment, and have linked them with my contacts at AIHS and MSFHR, Canadian provincial funders who can provide coaching and models from their excellent work promoting KT and building capacity.

Build on KT successes. I met with several research teams: Dr. Hugh Dawkins and his team at Genome Australia; Drs. David Lawrence and Donna Cross and colleagues at Telethon Kids Institute; Dr. Lisa Wood and colleagues in the School of Public Health at UWA; and Dr. Fiona Bull and her team at the Centre for the Built Environment at UWA. Many of them are already working in this space, and they were receptive and eager to expand the KT applications in their work, mindful that they need organizational supports and KT experts who can support the development of their KT skill set. There have been some false starts in some places, so perhaps my visit will reignite an awareness and commitment to move forward in knowledge translation within academic institutions. Shifts will need to happen in funding and promotion for this to really take hold in any meaningful way.

Explore innovative KT strategies to engage youth and share evidence. The practitioners in health and mental health services were welcoming of the KT perspective I shared, and whilst they juggle numerous difficulties in delivering services in geographically complex settings with hard-to-reach populations, there are some great innovations brewing. Youth counselors and clinicians at Alive and Kicking Goals in Broome are piloting iBobbly, a new software application developed by the Black Dog Institute, to address the staggering rate of suicide among Aboriginal youth – four times the national average. The iBobbly smartphone app has been designed specifically for Aboriginal people living in remote communities, and the approach, according to Alive and Kicking Goals’ Joe Tighe, might be able to reach people that traditional suicide prevention strategies can’t.

The feasibility of using apps with this hard to reach population was also discussed with David Wild and Darren Grassick at Headspace, Australia’s National Youth Mental Health Foundation, also in Broome. Headspace is a national initiative with a focus on the mental health and well-being of Australians. As with many regional and national initiatives, collecting standardized data about service delivery is challenging, a sentiment shared by those providing environmental health services for the Broome area, who spoke of the “clunkiness of common metrics for collecting evidence of health outcomes and service contacts”. Standardized questionnaires delivered with iPad technology appear to offer a feasible solution for improving engagement with youth and hard to serve populations.

David and Darren spoke of the importance of context for assessing symptoms of mental health disorders, noting that some of the indigenous youth who manifested symptoms of ADHD in local schools were free of these symptoms when living in their home communities. We spoke about the implications of such revelations for capturing functioning, symptoms, and outcomes across systems of care, and the need for policy makers to become more aware of the importance of context and the challenges of applying statewide measures in complex environments. Policy makers need to work with service providers to develop better ways of collecting data – of capturing both snapshots of client functioning as well as process outcomes. At a minimum, providers can report on who receives service, which may present a minimum data set when tracking how kids improve as a result of the services they receive proves contextually challenging. David and Darren committed to raising these issues for discussion at the upcoming Headspace meeting in Alice Springs in November, and I hope to follow up with them to continue the discussion.

Janice Forrester, coordinator for the local Rheumatic Heart Disease Service in Broome, shared information about their service and the challenges they experience in sharing important information about RHD with their young patients and their families. RHD encompasses heart disorders that can occur as a result of rheumatic fever, often resulting in heart valve damage. Her interest lay in learning how the service could better meet the needs of the regional population, and how they could more effectively disseminate key information about the disease, its treatment, and its complications. If heart damage from rheumatic fever is identified in childhood or young adulthood, daily antibiotics may be required until the age of 25 or 30 to help prevent recurrence of rheumatic fever and avoid bacterial infection of the heart valves or lining of the heart. Noting that their use of pamphlets or one-on-one consultation wasn’t always effective for knowledge translation, we discussed more engaged strategies they could use to share evidence about RHD, both to educate clinical staff and to empower families to identify the disease, seek out treatment, and comply with very long treatment regimens. Our collective brainstorming identified the potential for local indigenous ‘yarning groups’, and Tamika moved us to discussing the potential of the upcoming kids telethon, in particular its ability to engage national actors to highlight certain health issues in ways that would appeal to the country’s youth. For instance, Home and Away, a highly popular Australian soap opera watched by Aboriginal and non-Aboriginal youth alike, could serve as a highly engaging and effective vehicle for sharing key health messages and knowledge. I think we left Janice feeling inspired, and considering some avenues for knowledge exchange that she could explore.

Close the Gap between Policy and Practice. In the policy realm, I met with the Acting Commissioner for Children and Young People and colleagues, and with clinical and policy staff at the Mental Health Commission of Australia. I spent a half day with Dr. Simon Davies and his team at CAMHS, teaching them about KT, and discussing our work supporting Ontario’s outcome initiative with CAFAS. By the end of my journey, we were poised to collaborate on research exploring the renewed potential of their own system-wide outcome initiative using the HoNOSCA, which, while collected nationwide, has never been shared back with clinicians or with consumers.

Some common challenges emerged from our meeting with local CAMHS clinicians in Broome, who struggle with local application of policy in very difficult-to-serve communities. Their typical model of service delivery is family-structured individual therapy, which is difficult to implement in their rural, remote, and indigenous context. “They try hard to deliver the model, but the more rural you go, the less feasible that is. This is soul-destroying for the clinician, as there is no model that really fits the context.” In the first clinical session with the youth, the goal is to make a plan, write it down, and review a leaflet on rights and responsibilities; this is simply not feasible to accomplish at the first meeting in the absence of an established, trusting therapeutic relationship. Clinical supervision is variable and typically involves only case review, and new tele-health services are often staffed by adult psychiatrists with little pediatric experience. These are key challenges for service delivery in child and youth mental health.

There is a need for more liaison, capacity support, therapeutic group models, and consultation that are context relevant and feasible. Indigenous youth in these regions “live in the community, not within the family”, yet you can’t provide service without consent, and you can’t get consent from parents who are absent from the home for long periods of time, or who don’t have the capacity to provide it. This is a quagmire that requires collaborative problem solving and clinical input from the field to retrofit a policy for service delivery that is not working in the given context.

“Social change is not going to come from just knowing more information, but from doing something with it” (Pia Mancini, TEDGlobal 2014). It may be immodest, but I hope I left behind a new sense of urgency, motivation, and a clear pathway for change.


Thank you to Healthway who provided the funding for my visiting fellowship to Dr. Tamika Heiden at KT Australia, and Dr. Lisa Wood at the School of Public Health, University of Western Australia.

Image: Melanie Barwick and Tamika Heiden, Cable Beach, Broome, WA

Cutting a New Path for Academic Recognition in Australia

KT and Scholarship (click here for slides)

On September 30th 2014 I was invited to address a distinguished group of senior leaders in academia, health, and philanthropy at an evening reception for the Institute of Advanced Studies at the University of Western Australia. This blog post is derived from my remarks that evening.

In 2003, The Commonwealth Fund Task Force on Academic Health Centers in the United States published a report (1) affirming that academic health centres have “enjoyed an important public trust as recipients of enormous amounts of public funding for biomedical research” (viii). One could extend this assertion to institutions of higher learning more broadly. As research institutes and universities continue to play a pivotal role in the biomedical revolution – and in the development of all areas of research – the pace of change continues to accelerate and opportunities to apply new knowledge and to manage knowledge more efficiently continue to grow. The report went on to acknowledge that these organizations will face new challenges, specifically that they will be called upon to demonstrate that not only are they efficient producers of new knowledge but that they can apply that knowledge effectively, partner with non-academic institutions, and accomplish their goals in ways that meet public expectations.

The scale, scope and dynamic nature of social and economic burdens also demand that these organizations excel in the sharing of research-based knowledge with all knowledge users; that is to say, with members of the public, with communities, with policy makers, in addition to scientists and academics.  This shift, from a paradigm of the old adage, “knowledge is power” toward a new vision, “sharing knowledge is empowerment,” (2) will enable us to make more significant contributions to the well-being of mankind; it will accelerate our impact, and build stronger connections to society – all of which will strengthen the relevance of our universities and academic institutions for our time.

In thinking about scholarship, academic promotion, and the role of knowledge translation in these processes, it is useful to consider how universities have evolved in the last couple of centuries, and for this account I acknowledge the work of Cheryl Maurana (3) and Ernest Boyer (4). According to Boyer, scholarship in the United States higher education system has progressed through three discrete, yet overlapping stages. In the first stage, the 17th century colonial colleges focused on building the character of students and producing graduates who were prepared for civic and religious leadership. Teaching was considered a vocation like the ministry.

This was the dominant perspective until the mid-19th century, when it evolved to the second stage, characterized by universities’ growing focus on the practical needs of a growing nation. Universities saw themselves as having a direct role in supporting the nation’s business and economic prosperity, and in supporting the agricultural and mechanical revolutions. Education during this time was for ‘the common good.’ By the late 1800s, education was to be of practical utility and was about the application of knowledge to real problems.

The third stage of scholarly activity was characterized by an emphasis on basic research. Many scholars, having been educated in Europe, were intent on developing institutions focused on research and graduate education, modeled after the German research universities. Emphasis on teaching undergraduates or providing service decreased. The Second World War accelerated the focus on research as the academic priority, and Federal dollars were directed to universities for their scientific endeavors. After the war, this government support continued and the focus on scientific progress was firmly rooted.

This evolution of purpose has effectively narrowed the criteria for evaluating faculty scholarship, which means it has shaped – and limited – what we do in these roles. Promotion and tenure depend on conducting research and publishing results. Equating research and publication with scholarship and promotion has resulted in some unintended and limiting consequences. Firstly, it has created a culture in which academia is disconnected from the real world problems of contemporary society and has failed to recognize the key role of non-academics in applying this research to address the growing number and complexity of social, economic, and environmental concerns. We’ve become detached from what’s going on out there, and we are only now re-engaging with communities and knowledge users in a way that necessitates a new paradigm for scholarship and, relatedly, a new way of assessing scholarly work.

In their report on Scholarship in Public (5), Julie Ellison and Timothy Eatman quote the President of the Association of American Colleges and Universities, Carol Schneider, describing her view of what’s wrong with the promotion system: “Picture a figure eight; a flattened figure eight, turned on its side.  The left-side loop represents the academic field—with its own questions, debates, validation procedures, communication practices, and so on. The right-side loop represents scholarly work with the public—with community partners, in collaborative problem-solving groups, through projects that connect knowledge with choices and action.  Our problem is that scholarly practice is organized to draw faculty members only into the left-side loop. The reward system, the incentive system, our communication practices—all are connected with the left side only. Work within the right-side loop is discouraged, sometimes quite vigorously. Our challenge, then, is to revamp the terrain so that the reward system supports the entire figure eight, and especially scholarly movement back and forth between the two loops in the larger figure. Left-loop work ought to be informed and enriched by work in the right-side loop, and vice versa. Travel back and forth should be both expected and rewarded” (page x).

A second limiting consequence is that we have developed tunnel vision regarding who we share our findings and discoveries with. The predominant methods for sharing what we learn are peer-reviewed publications and conferences, both of which are knowledge translation strategies uniquely targeted to academic audiences. We do this because we want to contribute to the building blocks of scientific and academic discovery, but it has the unintended consequence of building walls between us and everyone else out there who could benefit from our work or who could contribute to solutions. In the words of Jonathan Lomas, it is no longer sufficient to be the authors of cutting edge research because in its raw form, “research information is not usable knowledge” (6). To be truly poised to improve health and well-being requires that we manage our research knowledge base more efficiently – sharing our tacit knowledge within our organizations and the sectors in which we work, as well as beyond our walls, with knowledge users who desire it and who are poised to act on it – with communities, service providers, educators, government policy and decision-makers, and the general public, who are oftentimes both the beneficiaries and benefactors of our research dollars.

A related consideration is that hoarding what we know presumes that only academics can solve the world’s problems. Rarely, however, are we in positions to implement our innovations and discoveries from where we sit, and so to ensure our work has an impact out there, we need to expand the audiences with whom we share our work and the partners with whom we conduct the work. New evidence from open innovation crowd-sourced platforms like Innocentive has shown us that difficult problems are solved more quickly when we can mobilize a large pool of expertise; it has also shown that solutions most often come from outside the discipline (7).

Whereas traditional innovation – writing papers, speaking at conferences, working in isolated cliques – occurs within the four walls of the enterprise and relies on internal experts, open innovation acknowledges that problem solvers and knowledge are widely dispersed and may reside outside the enterprise. We need to shift from traditional innovation which is often practiced by rigid cultures that focus too much on who solves problems, to embrace innovation in more open and collaborative cultures that focus on finding solutions to challenges. This means expanding how we define scholarship, how we document it, and consequently, how we behave as academics.

And it’s not just who we share with but how we share. Because writing a paper is not an effective way to get people to change what they do, we need to expand our knowledge translation competencies to include a range of strategies that can serve a variety of goals – everything from sharing knowledge, to building awareness, changing practice and behaviour, informing policy, and moving innovations to market for commercialization. These are all the purview of the modern day scholar. There is clear evidence that merely producing research and publishing in journals does little to change practice, and ultimately, does little to improve health and well-being. To have the impact we desire, science must ensure that research results translate to effective evidence-based practices in health care, in education, in social services. In the words of Jonathan Lomas, “excellence in research is laudable, but unless we can impact health and well-being, it presents an incomplete effort.” (8)

That a great many academics do little more than share with other academics using traditional pathways speaks to the fact that for the most part, we are only recognized for these efforts – that’s what counts as scholarship in academia. Traditional methods for translating research knowledge continue to form the basis for professional advancement through academic promotion. Consider this data I collected from scientists in the Child Health Evaluative Sciences program of our Research Institute in 2002 (9). Though a small sample, you can see where most of their knowledge translation activities lie.

We count indicators of scholarship such as number of grants, funding dollars, publications in prestigious journals, and presentations – such as this one – as well as excellence in teaching and mentorship and service to our institution. These indicators remain relevant, and to be sure, I am not suggesting that we need to dispense with them. What I am saying is that greater balance is required with a new range of metrics – a new paradigm of what counts as scholarship – because our traditional academic metrics have proved insufficient for achieving impact and have not kept pace with the evolving role of the academic, or with the demands of our changing world. More precisely, the number of grants you receive or the number of academic publications you accrue does little to demonstrate that the wide range of knowledge users for whom your work is relevant actually understand what you discovered or knew what to do with the knowledge you generated.

To borrow an expression from Chip and Dan Heath, counting grants and publications is TBU – true but useless (10), from an impact perspective. And therein lies the problem. Counting number of publications and grants is an easy way to capture our productivity relative to other academics – the left hand side of the figure eight. Documenting how our science has impacted real world problems and the well-being of others is less easy, but nonetheless entirely feasible, and we have a moral, ethical, and fiscal obligation to do so.

So how do we redefine scholarship? In 1990, Ernest Boyer was commissioned by the Carnegie Foundation for the Advancement of Teaching to examine the meaning of scholarship in the United States. Boyer assessed the roles of faculty in the US, and how these roles related to both the faculty reward system and the mission of higher education. And in this way, he created a new paradigm for scholarship that has been adopted by many institutions.

Boyer put forth four interrelated dimensions of scholarship:

  1. Discovery: inquiry directed to the pursuit of new knowledge
  2. Integration: making connections among disciplines and adding new insights
  3. Application: asking how knowledge can be applied to the social issues of the times
  4. Teaching: transmitting knowledge, but also transforming and extending it; what we would now conceptualize as teaching and knowledge translation

Building on Boyer’s work, Charles Glassick (11) was charged with determining the criteria used to evaluate scholarly work, from which emerged a set of standards for assessment:

  1. Clear goals
  2. Adequate preparation
  3. Appropriate methods
  4. Significant results
  5. Effective presentation
  6. Reflective critique

A similar perspective comes from Diamond and Adam (12), who identify six criteria for scholarly work:

  1. The activity requires a high level of expertise
  2. The activity breaks new ground or is innovative
  3. The activity can be replicated and elaborated
  4. The work and its results can be documented
  5. The work and its results can be peer reviewed
  6. The activity has significance or impact

These standards form the basis of the model of community-engaged scholarship: “Community-engaged scholarship is teaching, discovery, integration, application and engagement; clear goals, adequate preparation, appropriate methods, significant results, effective presentation, and reflective critique that is rigorous and peer-reviewed.” It is reflected in scholarship that involves the faculty member in a mutually beneficial partnership with the community.

Community-engaged scholarship overlaps with the traditional domains of research, teaching, and service, and takes an approach to these three domains that is often integrative. As illustrated in this figure (13), approaches such as community-based participatory research (CBPR) and service-learning (SL) represent types of community-engaged scholarship that are consistent with the missions of research, teaching, and service.

Engagement differs from dissemination or outreach. Engagement implies a partnership and a two-way exchange of information, ideas, and expertise, as well as shared decision making and collaboration in the efforts required for implementation and uptake.

How is this activity documented? Documentation is how a scholar presents evidence of her activities in a dossier; the evidence presented is about behaviours, activities, and qualities of that work.

For all of the reasons presented thus far, we have seen a shift in paradigm, mainly in North America, building on the work of Ernest Boyer and Charles Glassick. This shift has not been without its challenges. Science continues to be hierarchical, and as a social scientist situated in a health sciences research institute alongside basic scientists, I still encounter evidence of that hierarchy – but less so, and expressing it openly is no longer condoned in many academic circles, particularly as informed and visionary individuals take up leadership positions throughout the university.

There are also national challenges to sharing research. Consider a piece published in The Lancet Global Health in May of this year, in which the author, Chris Simms, writes: “Canada’s reputation is further undercut by its silencing of government scientists on environmental and public health issues: scientists are required to receive approval before they speak with the media; they are prevented from publishing; and, remarkably, their activities are individually monitored at international conferences. These actions have outraged local and international scientific communities. A survey done in December, 2013, of 4000 Canadian federal government scientists showed that 90% felt they are not allowed to speak freely to the media about their work, and that, faced with a departmental decision that could harm public health, safety, or the environment, 86% felt they would encounter censure or retaliation for doing so” (14).

Take a moment to reflect on your own bias – both personal and organizational. What stands as scholarly and counts for promotion in your institution?

  • Consider an investigator who studies pathways to care in early episode psychosis using qualitative methods, and publishes and presents her work to academic audiences (15).
  • And if her presentations are made in schools, to audiences of teachers and students, is that scholarship?(15)
  • And if she works with a choreographer to translate the main findings of her research to an original dance choreography and engages a troupe of dancers to communicate her research findings to students, is that scholarship? (15)

How many of you still have your hands up at this point in the story?

  • What about a scientist in pediatric pain who develops an app that encourages children to recount their pain experiences? (16)
  • What about a scientist who develops an animated vignette to share key research findings with practitioners; is that scholarship? (17)

Let’s look more closely at what this new paradigm of scholarship looks like. Different models have emerged from the work of Boyer and Glassick, variations on the standards proposed. In the interest of time, I will review how we have operationalized this new scholarship in the Department of Psychiatry and the Faculty of Medicine at the University of Toronto (18).

In our model, traditional concepts of teaching, research, and service have been replaced with a new scholarship paradigm encompassing learning, discovery, and engagement. We still assess excellence in teaching and research. But our new paradigm also includes Creative Professional Activity (CPA), a category of scholarly activity considered in promotion decisions. The Faculty of Medicine at the University of Toronto recognizes CPA under the following three broad categories.

Professional Innovation and Creative Excellence, which includes the making or developing of an invention, the development of new techniques, conceptual innovations, or educational programs inside or outside the University (e.g., continuing medical education or patient education). To demonstrate professional innovation, the candidate must show an instrumental role in the development, introduction, and dissemination of an invention, a new technique, a conceptual innovation, or an educational program. Creative excellence, in forms such as biomedical art, communications media, and video presentations, may be targeted at various audiences from the lay public to health care professionals.

Contributions to the Development of Professional Practices. In this category, the candidate must demonstrate leadership in the profession, in professional societies, associations, or organizations, or in government or regulatory agencies, that has influenced standards or enhanced the effectiveness of the discipline. Membership or the holding of office in professional associations is not itself considered evidence of creative professional activity; sustained leadership and the setting of standards for the profession are the principal criteria to be evaluated, and both internal and external assessment should be sought. Examples of contributions to the development of professional practice may include (but are not limited to) guideline development, health policy development, government policy, community development, international health and development, consensus conference statements, regulatory committees, and the setting of standards.

Exemplary Professional Practice is that which is fit to be emulated; is illustrative to students and peers; establishes the professional as an exemplar or role-model for the profession; or shows the individual to be a professional whose behaviour, style, ethics, standards, and method of practice are such that students and peers should be exposed to them and encouraged to emulate them.  To demonstrate exemplary professional practice, the candidate must show that his or her practice is recognized as exemplary by peers and has been emulated or otherwise had an impact on practice.

Community work becomes scholarship when it demonstrates current knowledge of the field, current findings, and invites peer review. The community work should be public, open to evaluation, and presented in a form others can build on.

So, what are the new metrics or criteria for assessing this type of scholarship?  They include:

  • Clear goals, clearly stated and jointly defined by community and academics; developed in partnership and based on community needs; and reflecting an issue that both community and academia think is important.
  • Adequate preparation: evidence that the scholar has the requisite knowledge and skills to conduct the assessment and implement the research, and has laid the groundwork for the program based on the most recent work in the field.
  • Appropriate methods, developed with partner involvement and representing a feasible approach, and significant results.
  • Effective presentation (knowledge translation), with documentation that the work of the partnership has been reviewed and disseminated in community and academic institutions; that presentations and publications have occurred at both community and academic levels; and that the results have been disseminated in a wide variety of formats appropriate for community and academic audiences.

Lastly, we need to see evidence of ongoing reflective critique – that the work has been evaluated; that the scholar thinks and reflects on her work; and we consider whether the community would work with this scholar again, and whether the scholar would engage the community again.

At the University of Toronto:

  • CPA may be linked to Research to provide an overall assessment of scholarly activity.
  • Contributions must be related to the candidate’s discipline and relevant to his/her appointment at the University of Toronto.
  • There should be evidence of sustained and current activity.
  • The focus should be on creativity, innovation, excellence and impact on the profession, not on the quantity of achievement.
  • There must be evidence that the activity has changed policy-making, organizational decision-making, or clinical practice beyond the candidate’s own institution or practice setting, including when the target audience is the general public.
  • Contributions will not be discounted because they have led to commercial gain, but there must be evidence of scholarship and impact on clinical practice.

Because of the variable activities included under CPA, diverse and sometimes innovative markers may be used to indicate its impact. The evidence upon which CPA will be evaluated can include a wide range of activities, some of which you will recognize as traditional and others that will seem novel.

  • Scholarly publications: papers, books, chapters, monographs
  • Non-peer-reviewed and lay publications
  • Invitations to scholarly meetings or workshops
  • Invitations to lay meetings or talks/interviews with media and lay publications
  • Invitations as a visiting professor or scholar
  • Guidelines and consensus conference proceedings
  • Development of health policies
  • Presentations to regulatory bodies, governments, etc.
  • Evaluation reports of scholarly programs
  • Evidence of dissemination of educational innovation through adoption or incorporation either within or outside the university
  • Evidence of leadership that has influenced standards and/or enhanced the effectiveness of health professional education
  • Creation of media (e.g., websites, CDs)
  • Roles in professional organizations (there must be documentation of whether the candidate is a leader or a participant)
  • Contributions to editorial boards of peer-reviewed journals (including Editor-in-Chief, Associate Editor, and board member)
  • Documentation from an external review
  • Unsolicited letters
  • Awards or recognition for the CPA role by the profession or by groups outside of the profession
  • Media reports documenting achievement or demonstrating the importance of the role played
  • Grant and contract record, including evidence of impact on activity of industry clients
  • Innovation and entrepreneurial activity, as evidenced by new products, new ventures launched or assisted, or licensed patents
  • Technology transfer
  • Knowledge transfer

Elsewhere, other institutions refer to this type of activity as community engaged scholarship or community scholarship, to capture the products from active, systematic engagement of academics with communities for such purposes as addressing a community-identified need, studying community problems and issues, and engaging in the development of programs that improve outcomes.

The document referenced here (19; slide 25) provides edited or distilled information from the websites of several institutions and entities that have recognized and seek to reward community-engaged scholarship (CES). Most are health science schools or departments. Three are not: one represents an entire university, one a social science department, and one a national body. These are good examples to explore.

Similar models are used elsewhere in North America; this is not a complete list (see slide 26).

There are very good resources describing these promotion practices, and illustrating how to document and assess them.

In their book on how to change things, Chip and Dan Heath (10) say that if you want things to change, you’ve got to appeal to both the rider – our rational side – and the elephant – our emotional side. The rider provides the planning and direction, and the elephant provides the energy and motivation.

Many organizations have paved the way and what remains is for you to apply this in your own institution – find the feeling, grow your people and systems, and rally the herd. My goal today was to point out a new destination for scholarship and academic promotion that fully considers a range of knowledge users and engaged strategies for sharing academic work. I have highlighted some of the ‘bright spots’ (10) – successful efforts that are worth emulating, and I have scripted some of the ‘critical moves’ (10) or elements of this new paradigm by sharing how some universities have defined and operationalized them.

Rally the herd at the university and research institutes

Here’s what you need to do to get to your new destination:

  1. Build on the work I’ve described and the new models shared to develop better methods to evaluate promotion and tenure practices that are inclusive of community scholarship in your own institution. Key to this process is the development of specific descriptions for faculty who are involved in this type of work, whether you call it creative professional activity as we do, community scholarship, or community engaged scholarship. Develop definitions that are meaningful in your context, and include standards of assessment, products, methods of documentation, and examples of faculty CVs. There are some examples available to get you started, as I’ve shown you, but they will likely need to be personalized to your own institutional context.
  2. Develop a national network of senior faculty in the field of community scholarship who can serve as mentors for other faculty and as models for junior faculty as they go forward for promotion.
  3. Cultivate and educate administrative leaders, senior faculty, and leaders of national associations and funding bodies to serve as champions for community scholarship and to advocate for policy change.
  4. Seek out and disseminate toolkits developed in support of community scholarship.

Grow your systems. Research funders are also important drivers for change, for if academics are going to expand their scholarly work, it needs to be funded. In Canada, the Canadian Institutes of Health Research and provincial funding bodies in Alberta and British Columbia have been important drivers for growth in KT and implementation science by way of tailored RFPs for science in this area that specifically request joint research leadership between nominated principal applicants and nominated principal knowledge users, and by allowances for KT activities in research budgets (not only for travel and open access publications).

Australasian funding bodies could broaden and deepen their appreciation of knowledge translation by moving beyond the clinical guideline toward appreciating the multiple forms of partnership, engagement, and KT strategies that better exemplify the scope of activities designed to create real-world impacts from the science they fund. These changes are low-hanging fruit; they are entirely feasible and critical to realizing real change in practice stemming from effective KT and partnerships.

Lastly, grow your people. The Hospital for Sick Children has two professional development opportunities that I’ve developed along with our KT team in the SickKids Learning Institute (20). Evaluations show they are highly regarded, valued, and effective for building knowledge and skills.

The Scientist Knowledge Translation Training course (SKTT) is run in Toronto and across Canada, and we’ve also taken it to the US and Scotland. It’s a two-day training course intended for health science researchers across all four scientific pillars (basic, clinical, health services, population health) who have an interest in sharing research knowledge with audiences beyond the traditional academic community, as appropriate, and in increasing the impact potential of their research. The course is also appropriate for KT professionals, policy and decision-makers, educators, and researchers in other areas of science. I am running one this week here at the university, and we are looking to bring this course to Australia in partnership with KT Australia in 2015.

The Knowledge Translation Professional Certificate (KTPC) is a five-day professional development course and the only course of its kind in North America. The curriculum, presented as a composite of didactic and interactive teaching, exemplars, and exercises, focuses on the core competencies of KT work in Canada, as identified by a survey of KT practitioners (21). This is a one-of-a-kind opportunity for professional development and networking, and is run in Toronto three times a year with a larger faculty. It too is highly subscribed and very well regarded; it’s also very hard to get into, as it sells out within 15 minutes each time we open registration.

And so, in closing, I leave you with a new destination to contemplate but also with some concrete actions to get you started on your journey. Committing to these recommendations will require you to have the courage to challenge the status quo, and a willingness to look beyond traditional reward systems and take risks to redefine them. It will require crafting a shared vision, and the commitment to identify leading scholars who can serve as role models and mentors. And I remind you that, in the end, the number of grants we get, the number of research studies we do, or the proliferation of publications produced matters little if they do not improve practice, health, and well-being. What we really want to get at is not how much research we have done, but how many lives are improved as a result of what we have accomplished.


(1) The Commonwealth Fund Task Force on Academic Health Centers. (2003). Envisioning the future of academic health centers. Web document: www.

(2) Rifkin SB & Pridmore P.  (2001). Partners in planning: Information, participation, and empowerment. London UK:  Macmillan Education Ltd, London.

(3) Maurana CA, Wolff M, Beck BJ & Simpson D.  (2001).  Working with our communities: moving from service to scholarship in the health professions.  Education for Health, 14(2), 207-220.

(4) Boyer EL.  (1996).  Scholarship reconsidered: priorities of the professoriate.  Princeton NJ: Carnegie Foundation for the Advancement of Teaching.

(5) Ellison J & Eatman TK. (2008). Scholarship in public: knowledge creation and tenure policy in the engaged university. Syracuse NY: Imagining America.

(6) Lomas J. (1990). Finding audiences, changing beliefs: the structure of research use in Canadian health policy. Journal of Health Politics, Policy and Law, 15, 525–542.

(7) Lakhani K, Bo L, Jeppesen P, Lohse A, & Panetta J. (2007). The value of openness in scientific problem solving. HBR Working Paper. From the web:

(8) Lomas J.

(9)  Barwick, M. (2002). Development of a knowledge translation strategy for population health sciences. Toronto, ON: The Hospital for Sick Children.

(10) Heath C & Heath D. (2010). Switch: how to change when change is hard. Toronto: Random House.

(11) Glassick, CE, Huber MT, & Maeroff GI.  (1997). Scholarship assessed: Evaluation of the professoriate. The Carnegie Foundation for the Advancement of Teaching. San Francisco, CA: Jossey-Bass, Inc.

(12) Diamond R. & Adam, B. (1993). Recognizing faculty work: Reward systems for the year 2000. San Francisco, CA: Jossey-Bass.

(13) Commission on Community-Engaged Scholarship in the Health Professions. (2005). Linking scholarship and communities. Seattle WA: Community-Campus Partnerships for Health.

(14) Simms, C.D. (May 2014). A rising tide: the case against Canada as a world citizen. The Lancet Global Health, 2(5), e264–e265.

(15) Boydell, K.  (2011).  Using performative art to communicate research: dancing experiences of psychosis.  Canadian Theatre Review, 146.

(16) Dr. Jennifer Stinson, Hospital for Sick Children.

(17) Dr. Melanie Barwick, The Hospital for Sick Children

(18) University of Toronto Department of Psychiatry Promotions.  See:



(21) Barwick M, Bovaird S & McMillan K. (under revision). Building capacity for knowledge translation practitioners in Canada. Evidence & Policy.

Reflections on the Australian Implementation Conference 2014

AIC14 Gang

I spent this past week in beautiful Sydney, Australia, for the 2nd Biennial Implementation Conference. It was a great opportunity to learn about what’s happening across Australia in KT and implementation, and was attended by over 400 delegates. The conference was ‘successful’ by the first day, if only because it attracted a blend of researchers (33%), policy makers (24%), practitioners (19%) and others (24%).  Missing from the group, although there were a few, were Australia’s national health funders, who have an important role to play in supporting implementation science and the legitimization of KT and implementation work in academia.

The schedule was packed with perspectives of both practice and science in implementation, from a variety of speakers. Here are my take-aways from the conference.

Use Data

Many organizations, including health service delivery organizations and government departments, have arrived at an understanding that they need data. And, for the most part, a good many of them collect data about service outcomes, health, and well-being. The problem is, they don’t use the data they collect in a meaningful way. Dr. Fred Wulczyn, Senior Research Fellow at Chapin Hall, University of Chicago (USA), talked about how use of administrative data needs to be a ‘deliberative process’. He encouraged us to ‘process the data, and process the narrative’ that emerges from the data. We need to develop a consistency in leadership wherein there is a willingness to use evidence, as well as a capacity to generate evidence. This resonated strongly for me, in the context of the data my team has produced for Ontario’s child and youth mental health sector over the last 14 years. We produced numerous quarterly and annual reports for service provider organizations and for government, but we were never convinced that the data were used for service planning; that providers or government policy makers actually took the time to review and reflect on the data and collectively arrive at a narrative of what the data suggested they should be doing to improve services. These efforts represented an underutilized contribution; a missed opportunity for reflection and improvement in service delivery.

If we are to achieve optimal utilization of the data we go to great lengths to collect, we need to be deliberate in our approach, and this requires us to change how we use our time, which means we must stop complaining that we don’t have the time. How we allocate our time in health services is a feasible change! In the same way that implementation of evidence-based practices requires time for planning and reflection, so too does the deliberate use of administrative data. On the policy side, government access to evidence is not the only issue; they must review and reflect on the data they receive. The data won’t tell a story on its own; it needs to be reviewed and reflected upon by those overseeing and providing services. At present, administrative outcome data sit gathering dust, much like all our reports, and represent unrealized potential for change. In the words of Atul Gawande, count something, write it down, and tell someone about it. Look for opportunities for change. Good words to live by.

Let Go the ‘Cherished Notions’

Service providers need to drop the services that have no evidence to support them; what several speakers referred to as their cherished notions. They can only know whether their services are effective if they examine their data and, equally, if they collect their outcome data on a routine basis to begin with. It is time to heighten our accountability to the children and families we serve, and to be honest about the effectiveness of the services we provide. For their part, recipients of mental health and health services need to ask their providers for evidence that their services work. We do no less when we go to hospital for cardiac surgery, or when we purchase other services. Somehow, this question is not being asked by families seeking services for their kids’ mental health. We need to empower them to do so.

Governments also need to invest in system wide outcome measurement, something that has been accomplished in many countries and jurisdictions. Without this, there is no basis for tracking change, and thus, no way of systematically assessing whether changes to evidence based practices are having the desired effects system wide. This too, is highly feasible. One need only look to models of success in Ontario, Michigan, New York, etc., plan on the change, and implement it.  Data systems are the backbone of service improvement, and it’s a huge oversight to try to improve systems through implementing evidence-based care without first building the infrastructure to support these efforts.

In the end, said Wulczyn, it’s about getting better outcomes for kids, not about defending a program that you like or providing treatment that you believe works without actually going the extra step to assess your fidelity.  No matter how long they’ve been practicing, clinicians are not off the hook in demonstrating their impact on the clinical outcomes of the kids and families they serve.    Sadly, our current models of supervision are not holding us to this practiced reflection, and that’s not a good thing.

Link the Changes

Administrative changes in how we measure and use data will only take us so far, according to Fred Wulczyn; they only nudge the needle.  To achieve whole system change and improved outcomes we need to link administrative data with fiscal changes – incentives to do the right thing, and with evidence about how kids and families experience the service system. Without this linkage, each of these events represents only a chapter of the whole story.

The system-wide changes we need to produce a more robust system were eloquently captured by Professor Brian Head, Program Leader in Policy Analysis at the University of Queensland. He identified six activities needed for building an effective policy system: collect data in a systematic and rigorous manner; utilize personnel with strong data analysis skills; improve institutional capacity to provide performance information and policy analysis of options; evaluate and review processes; open political culture and knowledge flows; and identify champions across all sectors to broker these changes.

Fund Partnerships and Change Academic Currency

A wonderful panel of women leaders was assembled to discuss Australasian perspectives on solving complex policy problems for the implementation of evidence, including Amanda Cattermole, General Manager, Budget Policy Division, The Treasury, Australian Government; Tricia Murray, CEO, Wanslea Family Services; Sally Redman AO, CEO, Sax Institute; Maree Walk, Chief Executive, Community Service Division, NSW Department of Family and Community Services; and Clare Ward, Chief Executive, Families Commission, New Zealand. They each shared their thoughts on how the implementation of evidence in practice requires partnerships between academia, government, and service delivery, and many had stories to share about how they’ve been involved with these efforts in their own jurisdictions. Yet they also said that for these partnerships to occur, they have to be recognized as relevant by health funders and as scholarly and worthwhile pursuits for academics, and in many of Australia’s universities, they are not. Many universities, particularly in North America, have adopted new formats and philosophies related to academic promotion, and have incorporated a new vision of scholarship that recognizes community engaged scholarship. Australia needs to turn to these models and start the conversation for change. I will be speaking on this topic at the University of Western Australia on September 30th, and will blog about my talk later this month.

Health research funders are a related and important driver for change. In Canada, the Canadian Institutes of Health Research has been an important driver for growth in KT and implementation science by way of tailored RFPs for science in this area that specifically request joint research leadership between nominated principal applicants and nominated principal knowledge users, and by allowances for KT activities in research budgets (not only for travel and open access publications). The National Health and Medical Research Council and other Australasian funding bodies need to broaden and deepen their appreciation of KT by moving beyond the clinical guideline to appreciating the multiple forms of partnership, engagement, and KT strategies that better exemplify the scope of activities designed to create real-world impacts from the science they fund. These changes are low-hanging fruit; they are entirely feasible and need to be acted upon if Australia is to realize real change in practice stemming from effective KT and partnerships.

Build the Triangle: Research on Implementation Drivers

Exciting work is happening to build on key innovations in implementation.  In particular, we have added to NIRN’s implementation triangle through our own CIHR funded research on KT in child and youth mental health, noting the emergence of supervision models as a key competency driver.   Bianca Albers described her doctoral work to develop the bottom of the triangle and expand on the role of leadership in driving implementation success.  This is an important and rather huge undertaking, given that the role of leadership in implementation pursuits is often poorly articulated and defined.  I know I will be following her work as it evolves, watching for new developments on this key driver.  Lastly, Jennifer Schroeder of the Implementation Group and Allison Metz from NIRN are doing interesting work on collective impact that looks promising for advancing our understanding of the right side of the triangle, the organizational drivers.

At the conclusion of the conference, I felt invigorated but also well aware that we have only touched the tip of the iceberg and have much collective work to do in this field as it continues to evolve and impact services.  As I look forward to the following four weeks in Perth, Broome, and Albany, the perspectives, challenges, and innovations highlighted at AIC2014 will percolate in my mind, and I look forward to sharing again as my Australian journeys in implementation continue.

Meetings, Trainings, Collaborations… My Australian Schedule


Monday Sept 22 2014

  • Meeting with Healthway funder
  • 1/2 day Professional Development Workshop for Act, Belong, Commit

Tuesday Sept 23 2014

  • 1/2 day Professional Development Workshop for Act, Belong, Commit

Wednesday Sept 24 2014

  • Travel to Broome, WA

Thursday Sept 25 2014

  • Broome Public Seminar
  • Meeting, Kimberley Public Health Unit, Broome WA

Friday Sept 26 2014

  • Meeting with Medicare Local, Broome WA
  • Meeting with Nirrumbuk Aboriginal Corporation
  • Meeting with WA Rheumatic Heart Disease Register & Control Program
  • Meeting with Alive & Kicking Goals


Monday Sept 29 2014

  • PD Day, Aussie Optimism, Curtin University

Tuesday Sept 30 2014

  • Seminar, School of Population Health, University of Western Australia
  • Meeting, Infant Mental Health Group, Mental Health Commission
  • Presentation and Reception, IAS, UWA

Wednesday Oct 1 2014

Thursday Oct 2 2014

  • Scientist KT Training, UWA

Friday Oct 3 2014

  • Scientist KT Training, UWA


Monday Oct 6 2014

  • Meeting, Giving West, Perth
  • Meeting, Edith Cowan University – Institutional Capacity Building for KT

Tuesday Oct 7 2014

  • Lunch discussion, Commissioner for Children and Young People
  • Presentation to Australian Society for Research Managers (ARMS) WA Chapter

Wednesday Oct 8 2014

  • Meeting with Dr. Hugh Dawkins’ team, Office of Population Health Genomics, Dept of Health, Gov of WA
  • Roundtable discussion, Human Capability Group / Dr Donna Cross, Telethon Institute for Child Health Research

Thursday Oct 9 2014

  • Public Seminar, AHPA and PHAA
  • Think Tank meeting, Aussie Optimism, Curtin University

Friday Oct 10 2014

  • 1/2 day workshop, Child and Adolescent Mental Health, CCYP offices


Monday Oct 13 2014

  • Lunch meeting, Centre for the Built Environment, UWA

Tuesday Oct 14 2014

  • fly to Albany WA
  • Public Lecture, Albany WA

Wednesday Oct 15 2014

  • Working breakfast, Albany WA
  • Meeting with Rural Clinic School, UWA Albany

Thursday Oct 16 2014

  • Lunch with funder, Healthway

Blogging as a Scholarly Activity: Cutting a New Path for Academic Recognition

I have thought about blogging ever since I ventured into the virtual world of social media.  I’ve been Tweeting since January 2009 and have found Twitter to be a very useful format for (1) getting the word out about things that interest me, whether I’ve produced them or not (always with proper credit and attribution), and (2) for finding useful nuggets of information that I can ingest and utilize in my own work or simply share with like-minded folks (as before, appropriately attributed).

Engaging on Twitter has been a very rewarding use of my time and, I believe, constitutes a scholarly activity in that it has helped me to disseminate my academic outputs much more widely than I could have done using traditional academic channels (not everything I produce lends itself to a scholarly publication in a peer reviewed journal). Moreover, Twitter has fed my mind, leveraged my innovative spirit, and connected me with tons of people (through followers and their followers, in turn) who would normally have been out of my reach through traditional academic channels. My most recent weekly Twitter stats: 6 new followers, 3 mentions, 2.11K mention reach, 8 retweets, and 621 retweet reach. Not hugely prolific by social media standards, but given my 884 Twitter followers, I am fairly confident that this amounts to many more people than would have access to or will have read any of my peer reviewed publications… just sayin’.

In the academic world, my foray into Twitter and blogging has pushed me to the ‘cutting edge’ and positioned me rather ahead of the curve amongst my scholarly colleagues. This is both good and challenging. Good, for the reasons stated above, and challenging because the onus is on me and my social-media-savvy colleagues to present a strong case for why this ‘activity’ should be recognized as scholarly, by demonstrating credible indices of our social media impacts.

Others have pondered the value of social media for academia. Academic involvement in blogging is on the rise but not yet considered standard academic practice, as discussed by Achilleas Kostoulas in a recent LSE ISE blog post.  Kostoulas believes that “the openness and equality of blogs is fundamentally more democratic than other forms of scholarly debate”, and he reflects on why we might choose to blog academically, what we should blog about, how much time it realistically takes to engage in blogging, and how to stay out of trouble.

I am careful to blog meaningfully, within a focused topic, and with a specific readership in mind. More salient for me, in everything I do outside of writing a paper for peer reviewed publication, is to ensure that I collect credible and useful evidence of my impact. I spend equal (or more) time in the non-peer-reviewed dissemination space, developing tools and resources for practice based on my science, and sharing on social media, and have yet to be academically penalized for it (although I do think about the potential academic costs every time my performance is reviewed). I have growing confidence in my conviction that impact indicators (altmetrics) are among the best metrics of scholarship, and they are giving journal impact factors a run for their money (see here for more on that argument: “The Impact Factor and Its Discontents“).

I feel somewhat blessed in that my academic institution, the University of Toronto, recognizes what it terms ‘creative professional activity‘ (aka community engaged scholarship, knowledge translation) as scholarship worthy of consideration for academic promotion. This practice is evident at other universities (see here for Canada) and represents an emerging movement. The Creative Professional Activity Committee for the Department of Psychiatry at the University of Toronto, which I now chair, has a goal of promoting the U of T model and encouraging other universities to evolve with the times. I plan to share this model when I visit Australian universities in the Fall.

The challenge brought about by social media in academia, and indeed, the challenge of science and academia in the present day, is to demonstrate how we impact and engage a variety of knowledge users with our scholarly outputs. Impact has to do with showing that people knew what to do with the knowledge you shared, and the task befalling the communicator is to capture that reality in a tangible way. This needn’t only be reflected in quantitative data such as that typically available from Google Analytics. Compelling narrative about the relevance and utility of research-based outputs should be captured from the perspective of the knowledge users with whom we are sharing our research findings.

I will return to this topic and share my ‘impact indicator methodology’ in the next year, when I embark on preparations for compiling my own academic dossier for promotion to Full Professor.  In the meantime, here are some useful resources that I often share in my professional development trainings and that you may find useful:

Happy Tweeting and Blogging to all!

Invitation to My Journey Down Under

On September 13th 2014 I will land in Sydney Australia, marking the beginning of my journey to bring my experiences in knowledge translation (KT) and implementation science (IS) Down Under!  First stop is the 2nd Biennial Australian Implementation Conference where I have been invited to keynote.  The program for the conference looks fabulous – filled with both research and practice lessons in implementation from Australia and beyond – and I’m really looking forward to participating! I will be running a pre-conference workshop on Tuesday September 16th called Building impact with knowledge translation and implementation: A practical approach to bridging the know-do gap – more information and registration details here:

I’m looking forward to exploring Sydney and to meeting formally and informally with colleagues who will be congregating in this terrific city for the conference. I know my host, Dr. Robyn Mildon, will have me busy and having fun! Lined up already is a meeting with Angela Dee, Project Manager for the Wobbly Hub & Double Spokes Project and good colleague of my buddy Dr. Katherine Boydell, who is doing some fascinating and innovative research in arts-based KT. I’m also planning to connect with Dr. Sue West, Associate Director for the Centre for Community Child Health at The Royal Children’s Hospital and Co-Research Group Leader for Policy, Equity and Translation with the Murdoch Children’s Research Institute. I’m free to meet with KT and IS folks on the Monday and Friday of that week, so get in touch if you’d like to meet!

And so, the raison d’être for this blog is to take you on this journey with me! I will meet interesting people and learn about KT and implementation in the Australian context, and I’d like to document this adventure in a way that will facilitate participation from my Canadian and global implementation network. This forum will allow me to share my learnings and my perceptions, and to connect my network (all of you who are following) with groups in Australia that may be of interest. And so, I invite you to comment throughout on all my forthcoming posts. To the extent that my schedule will allow, I would also be happy to make connections with folks here with whom you may already be connected and that could spur an interesting alliance or collaboration (my schedule to be posted shortly).

I will arrive in Perth, Western Australia on September 20th to undertake a Healthway Health Promotion Visiting Scholarship, awarded to Dr. Tamika Heiden of KT Australia and Prof Lisa Wood at the University of Western Australia. This award will have me touring Perth, Broome, and Albany to give workshops, academic and public talks, and informal conversations with a range of people across public health, health, and mental health sectors. This is an important time for KT and IS in Australia, as they are on the cusp of really developing KT in practice settings and are still (I believe) coming to an understanding of KT and how it can be relevant for them in their practice, and how they can build KT into their organizational frameworks.  I will have an opportunity to meet with high level government and university people, and will be able to share our University of Toronto model for Academic Promotion which acknowledges scholarly work in KT and community engaged work – something I know will be rather foreign to Australian universities.

I hope you will join me on this adventure Down Under!