Change in research and in journal publishing
We fired questions at editors of British Psychological Society journals…
27 June 2024
'We need to get better at making research accessible to lay, practice and policy audiences'
Professor Shelley McKeown Jones (University of Oxford) and Sammyh Khan (Örebro University) are editors of the British Journal of Social Psychology (BJSP).
What has changed?
A lot has changed but if we had to pinpoint four things, we'd suggest the following:
Method expectations: several journals in the field, including the BJSP, have shifted away from the publication of single-study cross-sectional quantitative designs and those with primarily university student samples (unless ecologically meaningful). This has been propelled by the replication crisis and subsequent improvements to the rigour, robustness, and transparency of research in the field, as well as diversification of research samples.
Topic focus: there has been a growing emphasis within the BJSP and other outlets on carrying out research that connects to context and considers implications not only for research but also for policy and practice.
Open research: engaging with journal practices that support open research such as by encouraging authors to make data and materials available as well as through publishing registered reports has become more normative within and beyond the BJSP.
Diversity: although the BJSP is known for its epistemological and methodological openness and breadth, the current Editorial Board has been further diversified in terms of the career status, gender, ethnicity, and geographical locations of its members. This ensures not only greater inclusiveness and professional development opportunities for a broader range of rising and prolific researchers in the field, but also a greater capacity to handle a more diverse range of perspectives.
What still needs to change?
Again, lots but let's focus on three main points:
Stronger reach beyond science communities: one thing that social psychology is quite poor at, relative to some other disciplines, is engaging with audiences outside academia. To help address this, we recently introduced a new impact-focused article type at the BJSP called Perspectives.
This is just a start and we need to get better at making research accessible to lay, practice and policy audiences through other approaches – for example, by making articles available to those working outside of academia, and by building a more comprehensive social media and online presence that better showcases what social psychology has to offer for understanding the world around us – which is a lot, actually.
Recognising the work of our EB and reviewers: the peer review process relies primarily on free labour, but this is not sustainable. Associate Editors are paid a nominal fee, but it doesn't fully recognise their work, and editorial consultants and reviewers do not receive anything. We could do better at compensation when workloads are so heavy for academics. Payment could be one way of doing this, but other incentives, such as waiving conference or Open Access fees, are also options.
Shifting norms of excellence: the BJSP is known as an outlet that embraces social psychological research in all its forms. Work is still needed, however, to create stronger norms across social psychology more broadly that ensure that we value both basic and applied methods and that we continue to tackle contextually relevant issues and don't shy away from the messiness of real-world data in search of 'clean' stories.
What are the barriers to that change?
There are two main barriers that we feel are preventing a greater rate of change (or any change at all):
Profit-focused publishing models: one of the biggest challenges we face is that current publishing models, across the board and not just within the BPS portfolio, are focused on profit-making. There are many implications of this but let's consider two in relation to our earlier points. First, charging high journal subscriptions and article publishing fees makes science inaccessible to those who might benefit most from it – lay, practice and policy audiences – and this goes against the impact agenda more broadly.
It's true we're moving forward with open access – 68 per cent of all articles published in BPS journals in 2023 were open access, and through Wiley's Transformational Agreements researchers at over 2,700 institutions worldwide (including all UK universities and an increasing number of institutions across Africa, Asia and the Middle East) can now publish in BPS journals at no charge.
But we do worry that such agreements currently favour the better-off institutions and therefore a certain type of science. Second, while within the BPS there's income going back into supporting charitable objectives, within the ecosystem as a whole there are a lot of people carrying out free labour whilst publishers and organisations make significant profits – this is ethically problematic.
Publication incentive structures: a further challenge is that there remains a somewhat narrow view of what counts as 'excellence' in social psychological research. For example, that experiments are the gold standard, that 'neat' stories are what we need and that what matters most is the impact factor of the journal that you publish in.
This is all tied up in how research is evaluated internally (e.g., for promotion cases within universities) and externally (e.g., for grant applications and awards) and has implications for individuals and their career progression. It is good to see some universities explicitly recognise open research and impact in promotion criteria but the norms in the field about what counts as excellence still need shifting.
What else may change, whether that's desired or not?
There is no doubt that we have experienced a rapid shift in research and publishing practices over the last few years in social psychology, most likely attributable to the replication crisis and our desire to promote and ensure more transparent research practices. It is inevitable that open research approaches will become even more normative and continue to push forward thinking around rigour and transparency far beyond the traditional experimental design.
In fact, it already is; a shameless plug here goes to our recent BJSP special section on Qualitative Open Research. At the same time, we do think there is a risk that, by not moving with the times and holding on to archaic practices, traditional journals like the BJSP will lose authors to publishing alternatives that seem to offer greater flexibility and openness.
Personal reflections
We joined the BJSP as Co-Editors in Chief in late 2022 and it's been an incredibly busy year and a learning curve. We have recruited a fantastic team of Associate Editors and Editorial Consultants, revised our aims and scope in line with our collective vision, commissioned three special issues, implemented a new article type and of course, handled a huge number of submissions since the start of our tenure.
The journal has gone from strength to strength, thanks to excellent previous leadership and the supportive team we have around us. In particular, we are privileged to work with an Editorial Board that makes the journal what it is, reviewing and making decisions on the many papers we receive and, of course, handling appeals and managing expectations and relationships – all under a tight timeframe with little or no financial reward.
We don't always agree with how the journal runs or with the various pressures we and the team are put under, but we both have great respect for the BJSP: it's the go-to journal for embracing social psychology in all its forms, and this is something that we are proud of and want to maintain. There are lots of things that we would like to change, and not everything is possible in a three-year term, but we will give it our best shot.
'We need to think differently about how we train doc students'
Professor David Putwain is a Chartered Psychologist and Fellow of the British Psychological Society at Liverpool John Moores University, and an Honorary Professorial Research Fellow at the University of Manchester. He is Editor of the British Journal of Educational Psychology.
What has changed?
There have been immense methodological and analytic developments in the past two decades as software has become more powerful and widely available. Studies based on quantitative naturalistic data now frequently make use of longitudinal data collection, experience sampling methods, latent variable modelling techniques, multi-level structuring of data, separating within- from between-person variance, and so on.
There are two issues here. First is the challenge of researchers staying up to date with new developments and being able to effectively train doctoral researchers in these methods. A related issue, from a journal perspective, is finding reviewers with sufficient expertise to review these articles. Second is that designs and analyses that were commonplace among the top internationally recognised journals 15 or 20 years ago (e.g., regression analyses based on cross-sectional data) are no longer published. The standards required of quantitative researchers and doctoral students are becoming higher if they wish to publish in the top tier of journals. This is as true of the BJEP as of others.
What still needs to change?
We need to think differently about how we train doc students. Rather than the model prevalent in the UK, of a student pursuing an individual topic on a three-year programme, it would be better to use the model adopted in some European countries: a four-year programme with taught modules in advanced design and analysis. In addition, similar to the model adopted in some fields of health and science, students would join project teams in order to access the types of data and design that are difficult for a single individual to collect, either on their own or when squeezed into a three-year programme (which creates immense challenges for longitudinal data collection). Funding for doctoral and postdoctoral positions needs to be massively increased in order to keep step with other leading research nations, which are far more generous with the opportunities offered than we are in the UK.
What are the barriers to that change?
Realistically, I can't see any of these types of changes as likely in the near future. Some of this is down to mismanagement of the HE sector by successive governments, who have attempted to drive market forces into education and shape a narrative around employability which naturally focuses on undergraduate education. In addition, I would argue that, given the high international standing of UK-based HE research compared to the relatively low level of funding, the funders take this position for granted. It is unlikely to continue this way given the priority afforded to research funding by international competitors. Once gone, it will be impossibly expensive to claw back. It will be gone for good. However, within the HE community, research leaders, university managers, and teaching staff must also share responsibility for a collective lack of vision to drive forward doctoral training in the ways required to equip the researchers of tomorrow.
What else may change, whether that's desired or not?
The risk is that educational psychology research is left behind by other nations that have much better doctoral and post-doctoral funding streams, and much better funding for research in general. While everyone would agree that obtaining research funding is difficult, my experience from reviewing for international funding agencies, and from both applying for and reviewing for UK-based funding streams, is that it is much harder to obtain funding for educational psychology research in the UK than elsewhere. I could speculate on the reasons, but ultimately they are nothing more than speculations. What we can conclude with more certainty is that it adds another barrier to the growth of UK-based educational psychology research.
I am keen to ensure that UK-based educational psychology research maintains an international presence. In the past 10 years or so, the number of UK-based undergraduate and masters' courses specialising in the psychology of education has increased, which will result in new career opportunities and a throughput of graduates. Some of these graduates will continue to doctoral-level study and beyond. This development is healthy and bodes well for the future. If I can make a small contribution by maintaining, and ideally strengthening, the position of the BJEP then all the better. That was my motivation for applying to become editor.
'One of the big problems is the high number of journals'
Costanza Papagno is Editor of the Journal of Neuropsychology.
I think the main change is the huge number of journals. There are at present too many journals, some of very low quality, and this means there is also a huge number of papers that need to be reviewed, while people refuse to perform reviews in the absence of any reward. As a result, manuscripts are often reviewed by very young (and inexperienced) researchers, reviews are often of very low quality, and sometimes it is absolutely clear that the reading of papers has been very superficial and inaccurate.
Authors must wait months before receiving a first round of reviews, which sometimes ask for further experiments that cannot be performed because patients are no longer available, researchers have changed institutions, and so on.
Therefore, in my opinion, one of the big problems at present is the high number of journals and the review process. If journals do not find a sensible reward for reviewers, obtaining accurate and on-time reviews will become more difficult. And this is very dangerous, considering the presence of ChatGPT.
Registered reports, in my experience, are rarely considered. In two years I have seen only one in my journal, and it was extremely difficult for the authors to obtain approval because of the incredible number of statistical constraints posed by the reviewers.
'We need to have a serious conversation about who our samples represent'
Harriet Tenenbaum and Dawn Watling are Editors of the British Journal of Developmental Psychology.
What has changed?
Studies are pre-registered, which should reduce data mining. As a result, the field should be able to move forward in terms of theory. At the same time, exploratory analyses can lead to new and exciting findings.
Powered sample sizes. In developmental psychology, studies of particular participant groups were often underpowered. We have been making changes to reflect on balancing rigour with the accessibility of participant groups. We have seen researchers beginning to use large-scale collaborations, such as the Many Babies studies, to support recruitment and research that addresses core research questions across the globe.
Registered reports are still rare in our field, but the British Journal of Developmental Psychology accepts registered reports. At first, both editors served as reviewers plus an additional reviewer. Now we are more comfortable and solicit reviews from two reviewers. We also accept registered reports after data have been collected but before the data are analysed.
We support the increased engagement with citizen science, co-creation, and collaborative design. These activities will strengthen our research work and impact.
What still needs to change?
We continue to need more work conducted in the Global South from an emic perspective. We also need collaborations with researchers from the Global South. This research will help us put development in context. Reviewers need to be mindful that sometimes research from the Global South will not have large samples because of difficulties with funding or recruitment.
We also need samples that are more representative of the UK. Child samples are frequently made up of white middle-class children, which does not reflect the UK population. We need to have a serious conversation about who our samples represent. We need to find ways to decolonise our research.
We need to start engaging more with schools, parents, and communities to develop an understanding of our research and to broaden participation (increase response rates).
What are the barriers to that change?
It remains difficult to access child participants because of Covid. Research is starting to pick up again in the developmental journals, but it has been slow.
We need to train and collaborate with researchers outside Europe, Australia, and North America. We need opportunities to engage with a wide variety of people from different backgrounds to make these changes.
'Open science myths are key ideological barriers'
Fuschia Sirois and Andrew Thompson are editors of the British Journal of Health Psychology.
The growth of the Open Science/research movement in recent years has had important implications for research practices, as well as how we publish and share research. Arguably, this movement (at least within psychology) was born on the back of the replicability crisis and concerns about scientific integrity raised by failed replication attempts and, in extreme cases, by scientific misconduct including falsification of data. The critical stance that underlies this movement has driven both transformation and debate about how to make psychological science more robust, transparent, and accessible beyond the academy.
As a result, we've seen the emergence of potentially game-changing research practices such as registered reports, open data, pre-registration of research studies, and open reviewing, alongside a surge in the number of open access (OA) publications. At BJHP, we've witnessed growth in the number of registered reports submitted, and Data Availability Statements have become the standard for BPS journals. Our OA content also continues to grow. For example, in 2023, the majority of the total content published in BJHP was OA, largely reflecting the many transformative agreements that our publisher, Wiley, now have in place with library consortia (including JISC in the UK) and institutions globally.
Whilst such agreements help address a key barrier to OA publishing, namely funding, it's not the only barrier to promoting and sustaining a more open research culture. As with any culture change, there can be both practical and ideological barriers that can operate at both an individual researcher level and at a systems level. Open science myths are key ideological barriers that can interfere with engaging in open research practices, until they are debunked.
One particularly stubborn myth that we've come across concerns what open data means and its implications. The concept of open data is sometimes viewed in a binary way, as open versus closed, rather than as being on a continuum from fully open, to available on request, to functionally closed.
Other concerns focus on whether data sharing will affect the integrity of the data if it is used out of context, and on the time and resources needed to prepare data to be shareable in an ethical and protected manner. This latter concern is especially true for qualitative data, which can require more processing and permissions to make it shareable.
Lastly, whilst research councils and funders have typically been instrumental in driving data sharing forward, some institutions' practices may present barriers. For example, health and social care services are rightly focussed on protecting the data they hold or that is collected under their auspices, yet this needs to be balanced against what might be achieved with greater adherence to the open science framework.
There are also myths and misconceptions about pre-registration that can discourage its uptake. Whilst pre-registration of meta-analyses and experiments in particular has become mandatory for some journals (but not BJHP, yet), it is often not considered as appropriate for other research designs, such as observational and longitudinal studies, or even analysis of secondary data and qualitative studies. But if we are to move towards a more transparent and open research culture then ANY study design that includes the testing of hypotheses and/or investigation of specific issues is appropriate for pre-registration. And yes, it does take a bit of work, especially for studies with more involved and complex methodologies. But it is also important to consider that pre-registration frontloads the same work that would be done at later stages of the research process, and in doing so provides a clear roadmap for the execution of the research.
Pre-registration is one of those topics within the constellation of open science practices that has continued to spur debate, especially among advocates for the return of more discovery-based science practices. But from our perspective, pre-registration shouldn't preclude opportunities for exploration. Instead, pre-registration provides a vehicle to be transparent about the research process, including documenting possible exploratory analyses. And if that bolt of inspiration about a potentially interesting question to explore hits after the study is pre-registered, that shouldn't be a barrier to conducting additional analyses, with justification. What matters is being open and transparent about any deviations from the protocol when reporting a pre-registered study. At BJHP we will look favourably on submissions that are pre-registered and will be encouraging our reviewers to be mindful of the content of these when reviewing.
As the open science movement shifts from replicability to generalisability, we anticipate that well-conceived and executed meta-analyses, systematic reviews, and meta-syntheses may become even more valuable. These approaches for synthesising evidence can provide glimpses into the gaps in the field that need to be addressed, as well as the contexts, samples, and methodological approaches that set the boundary conditions of the conceptual frameworks that guide our predictions. Ultimately, such insights have the potential to advance existing theory and spawn new theories, two goals which are of increasing importance especially within the field of health psychology.
One of the drivers for being Editors of BJHP was to ensure that BJHP remains in the vanguard of the Open Science/Access (OA) movement, and we have touched upon this in a recent editorial.
For FS, this goal resonates with a longstanding personal interest and practice in open science, and her current role as the chair of the Durham University Open Scholarship Working Group, which is tasked with articulating and implementing a vision for open scholarship at the university.
For AT, who has taught on research ethics and governance practices for many years and been a prior member of the British Psychological Society's Research Board, having an opportunity to further support the development of open science within the field of health psychology via editing BJHP is most welcome.
Key questions in research and journal publishing
What are the questions that you would like to see addressed around research and practice in Psychology? Perhaps around incentive structures, funding, accessing representative samples, analysing data, getting published? Let us know, and perhaps we can begin those conversations and provide a platform to discuss and generate change. Reach out via [email protected] or tag us @psychmag on Twitter.
- Keep up with the latest research from British Psychological Society journals, and gain exclusive access as a member.