12 October 2015
Author: Senior Lecturer (Institute of Biodiversity, Animal Health and Comparative Medicine), University of Glasgow
On September 28, The Conversation published an article, “Don’t fall for the deep-sea scaremongers – wild fishing is healthy and sustainable”, by Magnus Johnson, a senior lecturer in Environmental Marine Biology at the University of Hull. The article criticised a paper by marine biologists at the University of Glasgow and Marine Scotland Science on the regulation of deep-sea fishing. The lead authors of the study, David Bailey and Francis Neat, respond here.
Since publishing our study, “A scientific basis for regulating deep-sea fishing by depth”, we’ve been subjected to criticism online and in print from fisheries organisations, and most recently on this website in an article by Magnus Johnson. Johnson makes general points about the benefits of sustainable fisheries with which we agree, but his specific critique of our work falls well wide of the mark.
Our work suggests that stopping deep-sea trawling at a depth of around 600m makes sense, because beyond this depth the proportions of total bycatch and of elasmobranch bycatch species (sharks and rays) in the assemblage increase significantly. At the same time, indices of biodiversity are still increasing and the commercial value of the species present falls.
Fisheries leaders and the author of the article claim that our study, being based on research survey data, is not representative of the effects of commercial fishing and that, because bycatch is a “nuisance”, fishermen are able to avoid it. But what does the evidence actually say about deep-sea trawling? Our previous work showed that deep-sea fishing is unselective in its impacts on deep-sea fish. Unusually for a fishery, we were able to compare fish populations before and after deep-sea fishing began in an area off Ireland. Fish numbers were cut in half in less than 20 years – and non-target species were just as likely to be depleted as targets. Any fish species whose depth range reached into the fishing grounds was affected.
As for the selectivity of recent catches, a collaborative project between the French fishing company SCAPECHE and the French government research organisation IFREMER looked at the options for being selective through changes to gear and by identifying areas of high discarding which could be avoided. They had little success in this endeavour. The modified trial gears caught as much bycatch as the normal gear and the authors dismissed as unfeasible the sort of work required to design the highly-selective gears used in shallow fisheries. There was little spatial pattern in most elasmobranch bycatch species, so no feasible avoidance strategy was possible for these species. The authors concluded that a depth-based avoidance strategy was as likely to succeed as other more complex spatial measures.
For now at least there is little evidence that deep-sea trawling is highly selective. As a result, any method that shows trends in what species were available to be hit by the trawls would provide a fair representation of the trends in the impact of commercial fishing at different depths. Remember, it is the trends with depth that are the issue, not whether one net catches more than another. To disprove our study, our critics would need to show not only that commercial fishing is very much more selective than surveys, but that it gets relatively more selective with depth. Neither Johnson nor our other critics have provided any evidence for this.
Johnson further argues that our study was flawed because we failed to analyse any effect of time over the period of the study. In fact we have already done temporal studies on both the Irish and Scottish datasets, which indicate that, following the initial depletion of stocks, the populations have been generally stable, showing little sign of recovery.
The criticism that we used “pseudo-commercial nets rather than data from fishing boats” – and that this invalidates our results – would be extremely weak in any case. The scientific trawls are modified commercial nets with finer mesh in the cod end (the part where the fish are ultimately collected after being herded into the net) and therefore catch a wider range of fish sizes than commercial nets. This will influence indices of biodiversity, but will not affect retention of the larger species that contribute most to the biomass indices or catches of sharks for instance.
One of the gears (Jackson Trawls of Peterhead model BT195) is identical to commercial fishing gear used by Scottish vessels targeting monkfish. The Scottish monkfish survey was specifically developed together with the fishing industry so that direct comparisons between survey vessels and fishing vessels could be made. Despite variation in gear type, the trends in the indices were not significantly different. This is all set out in the paper or the many works underpinning it. It is little surprise, then, that our study also shows a very similar pattern of species richness with depth to those recorded from commercial trawls by on-board observers.
It is also worth noting at this point that the Scottish and French fishing industries had already agreed to an 800m limit before our paper came out – so now the question is whether this was the appropriate depth to choose. Our paper suggests not, because it demonstrates that, at depths beyond 600m, the detrimental impacts of trawling on the fish community become increasingly severe; an 800m limit would not be precautionary and risks continued ecosystem degradation.
A common argument put by industry is that this will be the “thin end of the wedge” and that NGOs will soon be back asking for the 200m and 400m limits for which they originally campaigned. We can say now that our study would not support the ban being moved shallower than 600m, and we would argue strongly against any NGO that proposed this. We follow the evidence – supported by the methodical collection of research data going back decades. Now we just want the science to be used.
This article was co-authored by Dr Francis Neat of Marine Scotland Science.